[{"url": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq", "probability": 0.6007584, "headline": "SPK Network Team Meeting Recording #1", "datePublished": "2023-04-21T04:23:38.043960", "datePublishedRaw": "3 days ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeieqfz64zubiky6awfgyvvjanigm6rno5dg2jcj4nv7xkr2p22i3mu", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tmDmbuy6SvS8ktfSayxCGeNgNEog6KKWXrT66uSc7EXnfsqy3QMqyiNQ6acE6NFiCS5.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak This is the recording of today's meeting. We plan to record every week on Thursdays at 20:00 UTC, so stay tuned. The live meeting occurs\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": "
\n\n
\n\n

\u25b6\ufe0f Watch on 3Speak

\n\n
", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq"},{"url": "https://hive.blog/hive-167922/@spknetwork.chat/yrftirib", "probability": 0.800691, "headline": "dLux - Decentralized layer 2 Solution", "datePublished": "2023-02-24T04:23:46.187694", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeidmzofrht6edfi3n6mn2og3acy7iep5w2ezgdscp4yuy66yyrnpva", "images": ["https://3speak.tv/embed?v=spknetwork.chat/yrftirib"], "description": "\u25b6\ufe0f Watch on 3Speak dLux is a place where users can post applications instead of just videos, blogs and pictures. The idea is to create an App Store that pays users\u2026 by spknetwork.chat", "articleBody": "dLux is a place where users can post applications instead of just videos, blogs and pictures.\nThe idea is to create an App Store that pays users instead of taking a 30% cut.\nHoney-comb aims to scale layer twos and make it more affordable to do things that shouldn't be expensive.\nThe idea is to create side chains where groups of people can start their own tokens and smart contracts to run their communities, games or co-ops.\nHoney-comb ecosystem is expected to provide services where people can run nodes for breakaway communities.\n\nThe goal is to provide an easy and lightweight way to decentralize token distribution to communities.", "articleBodyHtml": "
\n\n
\n

\u25b6\ufe0f Watch on 3Speak

\n\n

dLux is a place where users can post applications instead of just videos, blogs and pictures.
\nThe idea is to create an App Store that pays users instead of taking a 30% cut.
\nHoney-comb aims to scale layer twos and make it more affordable to do things that shouldn't be expensive.
\nThe idea is to create side chains where groups of people can start their own tokens and smart contracts to run their communities, games or co-ops.
\nThe Honey-comb ecosystem is expected to provide services where people can run nodes for breakaway communities.

\n\n

The goal is to provide an easy and lightweight way to decentralize token distribution to communities.

\n\n
", "canonicalUrl": "https://hive.blog/hive-167922/@spknetwork.chat/yrftirib"},{"url": "https://hive.blog/hive-112019/@spknetwork/guwdglwd", "probability": 0.7041709, "headline": "Multiparty State Channels - SPK Network Team Meeting", "datePublished": "2023-02-24T04:23:47.671540", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeidainpgyp3osdpg5ipc5qpydihjhpkurjzpmzsuqdp3gt4aaejhfu", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tmDhCSmjpwQjNMfzeGQ8u7EakqfN41Lyr1ENDptyPC7STvRTrkVAEvjwYJNDvPi9RAi.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://3speak.tv/embed?v=spknetwork/guwdglwd"], "description": "\u25b6\ufe0f Watch on 3Speak Join us as we discuss some design choices and challenges with building multi-party transactions. We are hoping the community has every\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nJoin us as we discuss some design choices and challenges with building multi-party transactions. We are hoping the community has every opportunity to understand and contribute feedback. For your clarity the code below is what's being referenced in the video... 
with some additional //comments.\n\n/* json => from => 3spk or account with broca to => person who can upload a file broker => account that can receive an upload broca => amount of broca to place into contract */ exports.channel_open = (json, from, active, pc) => { //the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering) if (active && json.to && json.broker){ var Pbroca = getPathNum([\"broca\", from]); //read memory to compute contract var Pproffer = getPathObj([json.contract,'proffer', from, json.to]) var Pstats = getPathObj([\"stats\"]) var PauthB = getPathObj([\"authorities\", from]) var PauthT = getPathObj([\"authorities\", json.to]); var PauthF = getPathObj([\"authorities\", json.from]); var Ptemplate = getPathObj([\"template\", json.contract]); Promise.all([Pbroca, Pproffer, Pstats, PauthF, PauthT, PauthB, Ptemplate]).then(mem => { var broca = mem[0], proffer = mem[1], stats = mem[2], authF = mem[3], authT = mem[4], authB = mem[5], template = mem[6], ops = [], err = '' //no log no broca? broca = broca_calc(broca, stats, json.block_num) //function is below if (typeof template.i != \"string\")err += `Contract doesn't exist.` if (typeof authF != 'string')err += `@${from} hasn't registered a public key. ` if (typeof authT != \"string\")err += `@${json.to} hasn't registered a public key. `; if (typeof authB != \"string\")err += `@${json.broker} hasn't registered a public key. 
`; if (proffer.ex)err += `This channel exists: ${proffer.ex.split(':')[1]} ` if (json.broca > broca.b || json.broca < stats.channel_min)err +=`@${from} doesn't have enough BROCA to build a channel`; if (!err) { proffer.t=json.to //to proffer.f=from //from proffer.b=json.broker //broker proffer.r=parseInt(json.broca) //resource credit proffer.a=parseInt( (json.broca / stats.channel_min) * stats.channel_bytes ); proffer.s=1 //status codes 1: exists; 2: following steps broca.b -=parseInt(json.broca); chronAssign(parseInt(json.block_num + 28800 ), { //builds a \"virtual op\" to process contract phase expiration block: parseInt(json.block_num + 28800 ), op: 'channel_check' , from, to: json.to, c: json.contract, ensure: 1 }).then(exp_path=>{ proffer.e = exp_path ops.push({ type: \"put\", path: [\"broca\", from], data: broca, }); const msg = `@${json.to} authorized to upload ${proffer.a} bytes to @${json.broker} by @${from} for ${parseFloat(json.broca / 1000).toFixed(3)} BROCA`; ops.push({ type: \"put\", path: [\"feed\", `${json.block_num}:${json.transaction_id}`], data: msg, }); if (config.hookurl || config.status) postToDiscord(msg, `${json.block_num}:${json.transaction_id}`); ops.push({ type: \"put\", path: [json.contract, 'proffer', from, json.to], data: proffer, }); if (process.env.npm_lifecycle_event == \"test\") pc[2] = ops; store.batch(ops); }) } else { ops.push({ type: \"put\", path: [\"feed\", `${json.block_num}:${json.transaction_id}`], data: err, }); if (config.hookurl || config.status) postToDiscord(err, `${json.block_num}:${json.transaction_id}`); if (process.env.npm_lifecycle_event == \"test\") pc[2] = ops; store.batch(ops); // stores memory, called the next contract in the promise chain } }) } else { pc[0](pc[2]); //ignores this failed contract, processes the next contract in the block. 
} }; exports.register_authority = (json, from, active, pc) => {//the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering) if ( active && json.pubKey && typeof json.pubKey == \"string\" && json.pubKey.substr(0, 3) == \"STM\" && json.pubKey.length == 53 ) { var ops = [{ type: \"put\", path: [\"authorities\", from], data: json.pubKey }]; store.batch(ops, pc); } else { pc[0](pc[2]); } }; const broca_calc = (obj, stats, block_num) => { //broca calculation called by many contracts const last_calc = require('./helpers').Base64.toNumber(obj.t) //decode terse memory const max = require(\"./helpers\").Base64.toNumber(obj.m); const accrued = parseInt((parseFloat(stats.broca_refill) * (block_num - last_calc))/max) obj.b += accrued if(obj.b > max)obj.b = max //places the current earned broca into the object, that will only be updated upon contract completion. This process saves unnecessary overhead for idle accounts. obj.t = require(\"./helpers\").Base64.fromNumber(block_num); return obj }\n\n![image.png](\n\nAbout the SPK Network:\n\nThe SPK Network is a decentralized Web 3.0 protocol that rewards value creators and infrastructure providers appropriately and autonomously by distributing reward tokens so that every user, creator, and platform will be able to earn rewards on a level playing field.
\n\n
\n\n

\u25b6\ufe0f Watch on 3Speak

\n\n
\"multiparty.png\"
\n\n

Join us as we discuss some design choices and challenges with building multi-party transactions. We are hoping the community has every opportunity to understand and contribute feedback. For clarity, the code below is what's being referenced in the video... with some additional //comments.

\n\n
\n/*\njson => \nfrom => 3spk or account with broca\nto => person who can upload a file\nbroker => account that can receive an upload\nbroca => amount of broca to place into contract\n*/\n\nexports.channel_open = (json, from, active, pc) => { //the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering)\n  if (active && json.to && json.broker){ \n    var Pbroca = getPathNum([\"broca\", from]); //read memory to compute contract\n    var Pproffer = getPathObj([json.contract,'proffer', from, json.to])\n    var Pstats = getPathObj([\"stats\"])\n    var PauthB = getPathObj([\"authorities\", from])\n    var PauthT = getPathObj([\"authorities\", json.to]);\n    var PauthF = getPathObj([\"authorities\", json.from]);\n    var Ptemplate = getPathObj([\"template\", json.contract]);\n    Promise.all([Pbroca, Pproffer, Pstats, PauthF, PauthT, PauthB, Ptemplate]).then(mem => {\n        var broca = mem[0],\n            proffer = mem[1],\n            stats = mem[2],\n            authF = mem[3],\n            authT = mem[4],\n            authB = mem[5],\n            template = mem[6],\n            ops = [],\n            err = '' //no log no broca?\n        broca = broca_calc(broca, stats, json.block_num) //function is below\n        if (typeof template.i != \"string\")err += `Contract doesn't exist.`\n        if (typeof authF != 'string')err += `@${from} hasn't registered a public key. `\n        if (typeof authT != \"string\")err += `@${json.to} hasn't registered a public key. `;\n        if (typeof authB != \"string\")err += `@${json.broker} hasn't registered a public key. 
`;\n        if (proffer.ex)err += `This channel exists: ${proffer.ex.split(':')[1]} `\n        if (json.broca > broca.b || json.broca < stats.channel_min)err += `@${from} doesn't have enough BROCA to build a channel`;\n        if (!err) {\n            proffer.t = json.to //to\n            proffer.f = from //from\n            proffer.b = json.broker //broker\n            proffer.r = parseInt(json.broca) //resource credit\n            proffer.a = parseInt(\n              (json.broca / stats.channel_min) * stats.channel_bytes\n            );\n            proffer.s = 1 //status codes 1: exists; 2: following steps\n            broca.b -= parseInt(json.broca);\n            chronAssign(parseInt(json.block_num + 28800 ), { //builds a \"virtual op\" to process contract phase expiration\n              block: parseInt(json.block_num + 28800 ),\n              op: 'channel_check',\n              from,\n              to: json.to,\n              c: json.contract,\n              ensure: 1\n            }).then(exp_path=>{\n              proffer.e = exp_path\n              ops.push({\n                type: \"put\",\n                path: [\"broca\", from],\n                data: broca,\n              });\n              const msg = `@${json.to} authorized to upload ${proffer.a} bytes to @${json.broker} by @${from} for ${parseFloat(json.broca / 1000).toFixed(3)} BROCA`;\n              ops.push({\n                type: \"put\",\n                path: [\"feed\", `${json.block_num}:${json.transaction_id}`],\n                data: msg,\n              });\n              if (config.hookurl || config.status)\n                postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);\n              ops.push({\n                type: \"put\",\n                path: [json.contract, 'proffer', from, json.to],\n                data: proffer,\n              });\n              if (process.env.npm_lifecycle_event == \"test\") pc[2] = ops;\n              store.batch(ops);\n            })\n\n      
  } else {\n        ops.push({\n            type: \"put\",\n            path: [\"feed\", `${json.block_num}:${json.transaction_id}`],\n            data: err,\n        });\n        if (config.hookurl || config.status)\n            postToDiscord(err, `${json.block_num}:${json.transaction_id}`);\n        if (process.env.npm_lifecycle_event == \"test\") pc[2] = ops;\n        store.batch(ops); // stores memory, called the next contract in the promise chain\n        }\n    })\n  } else {\n    pc[0](pc[2]); //ignores this failed contract, processes the next contract in the block.\n  }\n};\n\nexports.register_authority = (json, from, active, pc) => {//the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering)\n  if (\n    active &&\n    json.pubKey &&\n    typeof json.pubKey == \"string\" &&\n    json.pubKey.substr(0, 3) == \"STM\" && json.pubKey.length == 53 ) {\n    var ops = [{ type: \"put\", path: [\"authorities\", from], data: json.pubKey }];\n    store.batch(ops, pc);\n  } else {\n    pc[0](pc[2]);\n  }\n};\n\nconst broca_calc = (obj, stats, block_num) => { //broca calculation called by many contracts\n    const last_calc = require('./helpers').Base64.toNumber(obj.t) //decode terse memory\n    const max = require(\"./helpers\").Base64.toNumber(obj.m); \n    const accrued = parseInt((parseFloat(stats.broca_refill) * (block_num - last_calc))/max)\n    obj.b += accrued\n    if(obj.b > max)obj.b = max //places the current earned broca into the object, that will only be updated upon contract completion. This process saves unnecessary overhead for idle accounts. \n    obj.t = require(\"./helpers\").Base64.fromNumber(block_num);\n    return obj\n}\n\n
\n\n

![image.png](

\n\n

About the SPK Network:

\n\n

The SPK Network is a decentralized Web 3.0 protocol that rewards value creators and infrastructure providers appropriately and autonomously by distributing reward tokens so that every user, creator, and platform will be able to earn rewards on a level playing field.

\n\n
", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/guwdglwd"},{"url": "https://hive.blog/hive-112019/@spknetwork/pzblscua", "probability": 0.78161305, "headline": "Novel Voting Mechanism - SPK Network Team Meeting", "datePublished": "2023-02-24T04:23:49.587083", "datePublishedRaw": "2 months ago", "author": "spknetwork", "authorsList": ["spknetwork"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeicqfe7aqknbznfpgwye4b5zpmfoisleum46ga7p7hbsvqxlmc53lq", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tvcDoYEhV5T1hvznG7UmLJcv4GZ1qCBRdSPmn6xpvvkCXvufcLCSLAHXbE5pgQVAZJH.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://3speak.tv/embed?v=spknetwork/pzblscua"], "description": "\u25b6\ufe0f Watch on 3Speak We're buttoning up the next iteration of the SPK Network. Before we get there, we're discussing our new voting mechanism and hoping to\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nWe're buttoning up the next iteration of the SPK Network. Before we get there, we're discussing our new voting mechanism and hoping to get your feedback. This meeting is approaching an hour long, and below is all the code referenced directly in the video... 
with some additional //comments.\n\nexports.spk_vote = (json, from, active, pc) => { //the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering) var ops = [] //holds raw memory instructions if (active) { //ensures active key // memory reads var powp = getPathNum([\"spow\", from]), // @from's powered spk tpowp = getPathNum([\"spow\", \"t\"]), // the total powered spk dpowp = getPathObj([\"spowd\", from]), // @from's downpower operations pointer votebp = getPathObj(['spkVote', from]), // @from's last vote information pstats = getPathObj(['stats']) // current network parameters Promise.all([powp, tpowp, dpowp, votebp, pstats]).then((mem) => { var stats = mem[4] const DAOString = mem[3].substring(mem[3].indexOf(\",\")), lastVote = Base64.toNumber(mem[3].split(\",\")[0]) ? Base64.toNumber(mem[3].split(\",\")[0]) : json.block_num - parseInt(stats.spk_cycle_length), thisVote = Base64.fromNumber(json.block_num) + \",\" + (DAOString ? DAOString : \"\"), //decoding terse memory ago = json.block_num - lastVote, total = mem[1], power = mem[0], downs = Object.keys(mem[2]) var effective_power = power, effective_total, aValidator = false if(stats.validators?.[from]){ //determine if @from is a validator aValidator = true var powerVoted = 0 for (const block in stats.power_voted){ powerVoted += stats.power_voted[block] } power = (total - powerVoted)/20 //or number of validators } if (!power){ ops.push({ type: \"put\", path: [\"feed\", `${json.block_num}:${json.transaction_id}`], data: `@${from}| Attempted SPK vote with no voting power`, }); store.batch(ops, pc); } else if(downs.length && !aValidator){ getPathObj(['chrono', downs[0]]).then(down =>{ // additional step to recover downpower information from pointer finish(down) }) } else { finish() } function finish(down_obj) { if(down_obj?.amount){ effective_power = power - down_obj.amount } if (ago < parseFloat(stats.spk_cycle_length))effective_power=parseInt(effective_power * (ago / 
stats.spk_cycle_length)) else if (ago> parseFloat(stats.spk_cycle_length) && ago < stats.spk_cycle_length * 2)effective_power=effective_power* parseInt( effective_power * (1 - ((ago - stats.spk_cycle_length) / stats.spk_cycle_length) / 2) ) else if (ago>= stats.spk_cycle_length * 2)effective_power = parseInt(effective_power/2) effective_total = effective_total - effective_power const voteWeight = parseFloat(effective_power/effective_total).toFixed(8) const decayWeight = parseFloat( (effective_total - effective_power) / effective_total ).toFixed(8); //verify inputs, adjust constants if(json.spk_cycle_length < 28800)json.spk_cycle_length=28800 if(json.spk_cycle_length> 3000000)json.spk_cycle_length = 3000000 if(json.dex_fee < 0)json.dex_fee=0 if(json.dex_fee> 0.1)json.dex_fee = \"0.1\" if(json.dex_max < 0)json.dex_max=0 if(json.dex_max> 100)json.dex_max = 100 if(json.dex_slope < 0)json.dex_slope=0 if(json.dex_slope> 100)json.dex_slope = 100 if(json.spk_rate_lpow < 0)json.spk_rate_lpow=0 if(json.spk_rate_lpow> stats.spk_rate_ldel)json.spk_rate_lpow = stats.spk_rate_ldel if(json.spk_rate_ldel > stats.spk_rate_lgov)json.spk_rate_lpow = stats.spk_rate_lgov if(json.spk_rate_ldel < stats.spk_rate_lpow)json.spk_rate_ldel=stats.spk_rate_lpow if(json.spk_rate_lgov> 0.1)json.spk_rate_lgov = \"0.1\" if(json.spk_rate_lgov < stats.spk_rate_ldel)json.spk_rate_lpow=stats.spk_rate_ldel if(json.max_coll_members> 100)json.max_coll_members = 100 if(json.max_coll_members < 15)json.max_coll_members=15 json.max_coll_members=parseInt(json.max_coll_members) //stats.item=(json.vote * voteWeight) + (decayWeight * stats.item) stats.spk_cycle_length=(json.spk_cycle_length * voteWeight) + (decayWeight * parseFloat(stats.spk_cycle_length))> 28800 ? 
parseFloat((json.spk_cycle_length * voteWeight) + (decayWeight * stats.spk_cycle_length).toFixed(6)) : 28800 stats.dex_fee = parseFloat((json.dex_fee * voteWeight) + (decayWeight * parseFloat(stats.dex_fee))).toFixed(6) stats.dex_max = parseFloat((json.dex_max * voteWeight) + (decayWeight * parseFloat(stats.dex_max))).toFixed(2) stats.dex_slope = parseFloat((json.dex_slope * voteWeight) + (decayWeight * parseFloat(stats.dex_slope))).toFixed(2) stats.spk_rate_ldel = parseFloat((json.spk_rate_ldel * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_ldel))).toFixed(6) stats.spk_rate_lgov = parseFloat((json.spk_rate_lgov * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_lgov))).toFixed(6) stats.spk_rate_lpow = parseFloat((json.spk_rate_lpow * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_lpow))).toFixed(6) stats.max_coll_members = (json.max_coll_members * voteWeight) + (decayWeight * parseFloat(stats.max_coll_members)) < 25 ? 25 : ((json.max_coll_members * voteWeight) + (decayWeight * stats.max_coll_members)> 79 ? 79 : parseFloat((json.max_coll_members * voteWeight) + (decayWeight * stats.max_coll_members)).toFixed(6)) //useful-votes-calc if(!aValidator)stats.power_voted[stats.lastIBlock] = effective_power + (typeof stats.power_voted[stats.lastIBlock] == \"number\" ? stats.power_voted[stats.lastIBlock] : 0) ops.push({ type: \"put\", path: [\"stats\"], data: stats, }); ops.push({ type: \"put\", path: [\"spkVote\", from], data: thisVote, }); ops.push({ type: \"put\", path: [\"feed\", `${json.block_num}:${json.transaction_id}`], data: `@${from}| Has updated their votes.`, }); store.batch(ops, pc); } }); } else { ops.push({ type: \"put\", path: [\"feed\", `${json.block_num}:${json.transaction_id}`], data: `@${from}| Attempted SPK vote with posting key`, }); store.batch(ops, pc); } }\n\nThank you for participating in our development process and participating in these early stages.\n\nVote for our Witness:", "articleBodyHtml": "
\n\n
\n\n

\u25b6\ufe0f Watch on 3Speak

\n\n
\"coinvotingc.png\"
\n\n

We're buttoning up the next iteration of the SPK Network. Before we get there, we're discussing our new voting mechanism and hoping to get your feedback. This meeting is approaching an hour long, and below is all the code referenced directly in the video... with some additional //comments.

\n\n
exports.spk_vote = (json, from, active, pc) => { //the contract json:payload, from:@hiveaccount, active: active key used, pc:promise chain(for contract ordering)\n  var ops = [] //holds raw memory instructions\n  if (active) { //ensures active key\n// memory reads\n    var powp = getPathNum([\"spow\", from]), // @from's powered spk\n      tpowp = getPathNum([\"spow\", \"t\"]), // the total powered spk\n      dpowp = getPathObj([\"spowd\", from]), // @from's downpower operations pointer\n      votebp = getPathObj(['spkVote', from]), // @from's last vote information\n      pstats = getPathObj(['stats']) // current network parameters\n  Promise.all([powp, tpowp, dpowp, votebp, pstats]).then((mem) => {\n    var stats = mem[4]\n    const DAOString = mem[3].substring(mem[3].indexOf(\",\")),\n      lastVote = Base64.toNumber(mem[3].split(\",\")[0])\n        ? Base64.toNumber(mem[3].split(\",\")[0])\n        : json.block_num - parseInt(stats.spk_cycle_length),\n      thisVote =\n        Base64.fromNumber(json.block_num) + \",\" + (DAOString ? 
DAOString : \"\"), //decoding terse memory\n      ago = json.block_num - lastVote\n      total = mem[1],\n      power = mem[0]\n      downs = Object.keys(mem[2])\n      var effective_power = power, effective_total, aValidator = false\n      if(stats.validators?.[from]){ //determine if @from is a validator\n        aValidator = true\n        var powerVoted = 0\n        for (block of stats.power_voted){\n          powerVoted += stats.power_voted[block]\n        }\n        power = (total - powerVoted)/20 //or number of validators\n      }\n      if (!power){\n        ops.push({\n          type: \"put\",\n          path: [\"feed\", `${json.block_num}:${json.transaction_id}`],\n          data: `@${from}| Attempted SPK vote with no voting power`,\n        });\n        store.batch(ops, pc);\n      } else if(downs.length && !aValidator){\n        getPathObj(['chrono', downs[0]]).then(down =>{ // additional step to recover downpower information from pointer\n          finish(down)\n        })\n      } else {\n        finish()\n      }\n      function finish(down_obj) {\n        if(down_obj?.amount){\n          effective_power = power - down_obj.amount\n        }\n        if (ago < parseFloat(stats.spk_cycle_length))effective_power = parseInt(effective_power * (ago / stats.spk_cycle_length))\n        else if (ago > parseFloat(stats.spk_cycle_length) && ago < stats.spk_cycle_length * 2)effective_power = effective_power* parseInt(\n          effective_power *\n            (1 - ((ago - stats.spk_cycle_length) / stats.spk_cycle_length) / 2)\n        )\n        else if (ago >= stats.spk_cycle_length * 2)effective_power = parseInt(effective_power/2)\n        effective_total = effective_total - effective_power\n        const voteWeight = parseFloat(effective_power/effective_total).toFixed(8)\n        const decayWeight = parseFloat(\n          (effective_total - effective_power) / effective_total\n        ).toFixed(8);\n        //verify inputs, adjust constants\n        
if(json.spk_cycle_length < 28800)json.spk_cycle_length = 28800\n        if(json.spk_cycle_length > 3000000)json.spk_cycle_length = 3000000\n        if(json.dex_fee < 0)json.dex_fee = 0\n        if(json.dex_fee > 0.1)json.dex_fee = \"0.1\"\n        if(json.dex_max < 0)json.dex_max = 0\n        if(json.dex_max > 100)json.dex_max = 100\n        if(json.dex_slope < 0)json.dex_slope = 0\n        if(json.dex_slope > 100)json.dex_slope = 100\n        if(json.spk_rate_lpow < 0)json.spk_rate_lpow = 0\n        if(json.spk_rate_lpow > stats.spk_rate_ldel)json.spk_rate_lpow = stats.spk_rate_ldel\n        if(json.spk_rate_ldel > stats.spk_rate_lgov)json.spk_rate_lpow = stats.spk_rate_lgov\n        if(json.spk_rate_ldel < stats.spk_rate_lpow)json.spk_rate_ldel = stats.spk_rate_lpow\n        if(json.spk_rate_lgov > 0.1)json.spk_rate_lgov = \"0.1\"\n        if(json.spk_rate_lgov < stats.spk_rate_ldel)json.spk_rate_lpow = stats.spk_rate_ldel\n        if(json.max_coll_members > 100)json.max_coll_members = 100\n        if(json.max_coll_members < 15)json.max_coll_members = 15\n        json.max_coll_members = parseInt(json.max_coll_members)\n        //stats.item = (json.vote * voteWeight) + (decayWeight * stats.item)\n        stats.spk_cycle_length = (json.spk_cycle_length * voteWeight) + (decayWeight * parseFloat(stats.spk_cycle_length)) > 28800 ? 
parseFloat((json.spk_cycle_length * voteWeight) + (decayWeight * stats.spk_cycle_length).toFixed(6)) : 28800\n        stats.dex_fee = parseFloat((json.dex_fee * voteWeight) + (decayWeight * parseFloat(stats.dex_fee))).toFixed(6)\n        stats.dex_max = parseFloat((json.dex_max * voteWeight) + (decayWeight * parseFloat(stats.dex_max))).toFixed(2)\n        stats.dex_slope = parseFloat((json.dex_slope * voteWeight) + (decayWeight * parseFloat(stats.dex_slope))).toFixed(2)\n        stats.spk_rate_ldel = parseFloat((json.spk_rate_ldel * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_ldel))).toFixed(6)\n        stats.spk_rate_lgov = parseFloat((json.spk_rate_lgov * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_lgov))).toFixed(6)\n        stats.spk_rate_lpow = parseFloat((json.spk_rate_lpow * voteWeight) + (decayWeight * parseFloat(stats.spk_rate_lpow))).toFixed(6)\n        stats.max_coll_members = (json.max_coll_members * voteWeight) + (decayWeight * parseFloat(stats.max_coll_members)) < 25 ? 25 : ((json.max_coll_members * voteWeight) + (decayWeight * stats.max_coll_members) > 79 ? 79 : parseFloat((json.max_coll_members * voteWeight) + (decayWeight * stats.max_coll_members)).toFixed(6))\n        //useful-votes-calc\n        if(!aValidator)stats.power_voted[stats.lastIBlock] = effective_power + (typeof stats.power_voted[stats.lastIBlock] == \"number\" ? 
stats.power_voted[stats.lastIBlock] : 0)\n        ops.push({\n          type: \"put\",\n          path: [\"stats\"],\n          data: stats,\n        });\n        ops.push({\n          type: \"put\",\n          path: [\"spkVote\", from],\n          data: thisVote,\n        });\n        ops.push({\n          type: \"put\",\n          path: [\"feed\", `${json.block_num}:${json.transaction_id}`],\n          data: `@${from}| Has updated their votes.`,\n        });\n        store.batch(ops, pc);\n      }\n  });\n  } else {\n    ops.push({\n      type: \"put\",\n      path: [\"feed\", `${json.block_num}:${json.transaction_id}`],\n      data: `@${from}| Attempted SPK vote with posting key`,\n    });\n    store.batch(ops, pc);\n  }\n}\n\n
\n\n

Thank you for participating in our development process and for being part of these early stages.

\n\n

Vote for our Witness:

\n\n
", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/pzblscua"},{"url": "https://hive.blog/cafiene/@disregardfiat/supermarket", "probability": 0.5650552, "headline": "Supermarket", "datePublished": "2022-11-24T04:23:52.301300", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/Eos2L4QESaC25AGHxFEmjnM7GLhuLCWEi67imDAaLzkx7Wr8Gtw7fVcVhbRssnosrrN.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/Eos2L4QESaC25AGHxFEmjnM7GLhuLCWEi67imDAaLzkx7Wr8Gtw7fVcVhbRssnosrrN.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/Eoc8UxnL4VzpXEo8JbpCYRvthajezvTcNfvvy6dufkP6acpM6HYMzyAi7pxe1T7zSjf.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23vsSyk6rrR4m9C7Xfm3QAPYQrwUcA66h2hwGaAyDKW885Tu5PdwQ9Jm5d7i7oAMKcyP5.jpg"], "description": "Terer\u00e9 16th of Novemblog I gotta punt this one. Enjoy pictures of the massive selection dedicated to Terer\u00e9 at my local market. It's the national drink for a reason and I\u2026 by disregardfiat", "articleBody": "Terer\u00e9\n\n16th of Novemblog\n\nI gotta punt this one. Enjoy pictures of the massive selection dedicated to Terer\u00e9 at my local market. It's the national drink for a reason and I don't feel like I can speak to it as well as others can.", "articleBodyHtml": "
\n\n

Terer\u00e9

\n\n

16th of Novemblog

\n\n

I gotta punt this one. Enjoy pictures of the massive selection dedicated to Terer\u00e9 at my local market. It's the national drink for a reason and I don't feel like I can speak to it as well as others can.
\n\"The

\n\n
\"Simple
\n\n
\"Herbs
\n\n
", "canonicalUrl": "https://peakd.com/cafiene/@disregardfiat/supermarket"},{"url": "https://hive.blog/dev/@disregardfiat/oceans-0x", "probability": 0.9376172, "headline": "Oceans 0x", "datePublished": "2022-11-24T04:23:56.654019", "datePublishedRaw": "5 months ago", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23viRKQFm6g14c6tVexrUeEXV7oGfdRFMSiPHkrE1AxxVccCBPjLyAm2c272WGHCBKif3.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tGTmQCv3v3SHeQ3n6rHMosrcpt4LKX7a4fDZZzWbowSyRCZQyGLcm31kULsWPc9aWkC.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23viRKQFm6g14c6tVexrUeEXV7oGfdRFMSiPHkrE1AxxVccCBPjLyAm2c272WGHCBKif3.png"], "description": "Let's plan a robbery of a HoneyComb multi-sig account. by disregardfiat", "articleBody": "Organizing a Heist\n\n15th of Novemblog\n\nToday I'm going to show you exactly how to steal community funds from HoneyComb. It's a bit of a technical process, but where there is a will there is a way.\n\nFirst things first, you're going to need a fair amount of capital. Robbing a casino required them(in the Oceans 11 movie) to buy a swat truck, building a fake vault, losing enough money to be comped a high roller suite, flying in 11 people, not to mention skills that are quite hard to put an actual cost on like a contortionist who is small and skilled enough for the job. At the beginning of the day... the only reason this heist took place was the blank check Reuben gave the team and his motive was vengeance; no sane person or group of people would invest millions of dollars on such a long shot with even further risks to their freedom.\n\nBy the end of this you'll understand that the motives for an attack on our network will have to be similarly motivated, as the capital required will exceed the expected returns. 
For instance, a state actor who has more to lose with competing governance technology.\n\nSo You Have Money To Loose...\n\nFirst things first. Let's buy some tokens in the ecosystem you are attacking.\n\n@spk-cc currently has 5,014.807 HIVE & 18.415 HBD. Roughly $1500 USD. In it's current state accounts controlled by @theycallmedan, @blocktrades and @ocd hold the keys, and you'll need to place more locked capital in the ecosystem than them. DLUX already runs a more robust version of this with upto 13 key holders, and SPK Network v1.2 could have as many as 40 depending on community votes. There are a few other ways to trick these key holders into signing a fraudulent transaction... but you'll need to acquire at least 1 of these key holder spots... AND the majority of all the other accounts including the key holders.\n\nThese other methods are due to the fast replay nature of this blockchain, the network consensus is upon hashed sets of database instructions to build the decentralized state. Inserting fraudulent instructions would need to come from a much higher number of the accounts than the simple majority of key holders. This attack also couldn't happen all at once, as to be \"elected\" into the group that can submit hashes for consideration already placed members would need to drop out. Hopefully enough time to allow the network to be aware of random name accounts trying to join the network with out showing up in our chat room.\n\nThe same is true for accounts that wanted to acquire controlling level stake all at once... 
but depending on circumstances it would be possible for 2 accounts to do this, but it would take at least 90% of a day long period to be considered active and honest enough to hold the keys.\n\nSo to elbow your way to either eventuality above you'd need either ~2.1 million tokens locked between two accounts, or 2.2 millions token arranged precisely between 13 accounts.\n\nHow much would that cost historically?\n\nGoing back a couple of days it seems the exchange volume is just north of 1000 Hive per day. You would need to purchase this amount of tokens slowly, because there aren't even close to 2.1M tokens available for sale. For security reasons like this, open orders expires after a maximum of 30 days, so negligent sellers won't accumulate an attack vectors worth of tokens to instantly buy. The same is true on the buy side, but mostly because nearly every difficult security decision and cost in this paradigm is to protect the account with the open orders. Maintaining them are the majority of costs in the infrastructure.\n\nSo assuming the price stays at .01 LARYNX per HIVE... you'll need to spend ~21,000 HIVE or ~22,000 HIVE. This will take roughly 3 weeks. IF token in this amount become available. Chances are the market value will jump and the attack will take longer or the cost will go up.\n\nWhile you are accumulating these tokens you might as well participate in the network. DEX fees are distributed equally to accounts, so there is no real incentive to put all your funds in one account. blocktrades, tcmd and ocd each make the same on dex fees as @mannimanccadm with as little as 30 times less staked. With 25 accounts, and up to 13 accounts needed to achieve this \"hack\" you can pull in about 25% of the dex fees over your buy in period. Which will net you up to 100 HIVEs worth of LARYNX.\n\nYou could easily currently sell many times less Larynx to buy the entire order book. The margins may grow or shrink... 
but they will never get near enough to make this kind of confidence hack worth it.\n\nWorth It?", "articleBodyHtml": "
\n\n

Organizing a Heist

\n\n

15th of Novemblog

\n\n

Today I'm going to show you exactly how to steal community funds from HoneyComb. It's a bit of a technical process, but where there is a will there is a way.

\n\n

First things first, you're going to need a fair amount of capital. Robbing a casino required them (in the Ocean's Eleven movie) to buy a SWAT truck, build a fake vault, lose enough money to be comped a high-roller suite, and fly in 11 people, not to mention skills that are quite hard to put an actual cost on, like a contortionist who is small and skilled enough for the job. At the end of the day... the only reason this heist took place was the blank check Reuben gave the team, and his motive was vengeance; no sane person or group of people would invest millions of dollars on such a long shot with even further risks to their freedom.

\n\n

By the end of this you'll understand that an attack on our network would have to be similarly motivated, as the capital required will exceed the expected returns. For instance, a state actor who has more to lose from competing governance technology.

\n\n

So You Have Money To Lose...

\n\n

First things first. Let's buy some tokens in the ecosystem you are attacking.

\n\n

@spk-cc currently has 5,014.807 HIVE & 18.415 HBD. Roughly $1500 USD. In its current state, accounts controlled by @theycallmedan, @blocktrades and @ocd hold the keys, and you'll need to place more locked capital in the ecosystem than them. DLUX already runs a more robust version of this with up to 13 key holders, and SPK Network v1.2 could have as many as 40 depending on community votes. There are a few other ways to trick these key holders into signing a fraudulent transaction... but you'll need to acquire at least 1 of these key holder spots... AND the majority of all the other accounts including the key holders.

\n\n

These other methods exist because of the fast-replay nature of this blockchain: network consensus is built upon hashed sets of database instructions that assemble the decentralized state. Inserting fraudulent instructions would need to come from a much higher number of accounts than a simple majority of key holders. This attack also couldn't happen all at once, because to be \"elected\" into the group that can submit hashes for consideration, already-placed members would need to drop out. Hopefully that allows enough time for the network to become aware of random-name accounts trying to join the network without showing up in our chat room.

\n\n
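To make that consensus mechanic concrete, here is a minimal sketch (in Python, with hypothetical names; HoneyComb itself is not written this way) of nodes agreeing on a hashed set of database instructions, where a fraudulent batch from a minority simply never reaches the consensus threshold:

```python
import hashlib
import json
from collections import Counter

def state_hash(instructions):
    # Hash a node's ordered batch of database instructions deterministically.
    payload = json.dumps(instructions, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def consensus_hash(reported, threshold):
    # Accept the hash reported by at least `threshold` nodes, else no consensus.
    winner, count = Counter(reported).most_common(1)[0]
    return winner if count >= threshold else None

honest = state_hash([{"op": "credit", "acct": "alice", "amt": 10}])
fraud = state_hash([{"op": "credit", "acct": "mallory", "amt": 10**6}])

# Three honest reporters outvote one fraudulent one...
assert consensus_hash([honest, honest, honest, fraud], threshold=3) == honest
# ...but a plurality short of the threshold settles nothing.
assert consensus_hash([honest, fraud, fraud], threshold=3) is None
```

This is why a fraudulent state needs far more cooperating accounts than a simple majority of key holders: the hash itself is the thing voted on.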

The same is true for accounts that wanted to acquire controlling-level stake all at once... depending on circumstances it would be possible for 2 accounts to do this, but it would take at least 90% of a day-long period to be considered active and honest enough to hold the keys.

\n\n
\n\n

So to elbow your way to either eventuality above you'd need either ~2.1 million tokens locked between two accounts, or 2.2 million tokens arranged precisely between 13 accounts.

\n\n

How much would that cost historically?

\n\n
\n\n

Going back a couple of days, it seems the exchange volume is just north of 1000 HIVE per day. You would need to purchase this amount of tokens slowly, because there aren't even close to 2.1M tokens available for sale. For security reasons like this, open orders expire after a maximum of 30 days, so negligent sellers won't accumulate an attack vector's worth of tokens to instantly buy. The same is true on the buy side, but mostly because nearly every difficult security decision and cost in this paradigm is to protect the account with the open orders. Maintaining them is the majority of the cost in the infrastructure.

\n\n
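The 30-day order expiry can be sketched as a simple pruning rule; `prune_expired` and the order fields here are hypothetical illustrations, not HoneyComb's actual API:

```python
from datetime import datetime, timedelta

MAX_ORDER_AGE = timedelta(days=30)  # open orders expire after at most 30 days

def prune_expired(open_orders, now):
    # Keep only orders younger than the 30-day maximum, so no seller
    # accumulates an attack vector's worth of instantly buyable tokens.
    return [o for o in open_orders if now - o["placed"] <= MAX_ORDER_AGE]

now = datetime(2022, 11, 24)
book = [
    {"id": 1, "placed": now - timedelta(days=45)},  # stale: dropped
    {"id": 2, "placed": now - timedelta(days=5)},   # fresh: kept
]
assert [o["id"] for o in prune_expired(book, now)] == [2]
```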

So assuming the price stays at 0.01 HIVE per LARYNX... you'll need to spend ~21,000 HIVE or ~22,000 HIVE. This will take roughly 3 weeks, IF tokens in this amount become available. Chances are the market value will jump, and the attack will take longer or the cost will go up.

\n\n
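As a sanity check on those numbers, here is the back-of-envelope arithmetic (all figures taken from the paragraphs above; the 21-day result assumes every HIVE of daily volume is yours to buy):

```python
# Figures from the paragraphs above.
tokens_needed = 2_100_000        # ~2.1M LARYNX locked between two accounts
hive_per_larynx = 0.01           # assumed steady price
daily_volume_hive = 1_000        # exchange volume, just north of 1000 HIVE/day

cost_hive = tokens_needed * hive_per_larynx     # total HIVE to spend
days_needed = cost_hive / daily_volume_hive     # buying all daily volume

assert cost_hive == 21_000       # matches the ~21,000 HIVE estimate
assert days_needed == 21.0       # roughly 3 weeks
```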

While you are accumulating these tokens you might as well participate in the network. DEX fees are distributed equally to accounts, so there is no real incentive to put all your funds in one account. blocktrades, tcmd and ocd each make the same on DEX fees as @mannimanccadm with as little as 30 times less staked. With 25 accounts, and up to 13 accounts needed to achieve this \"hack\", you can pull in about 25% of the DEX fees over your buy-in period. Which will net you up to 100 HIVE's worth of LARYNX.

\n\n
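A sketch of that equal fee split, assuming a hypothetical pool of 100 eligible accounts (the function and names here are illustrative, not the actual HoneyComb code):

```python
def split_dex_fees(total_fees, eligible):
    # Fees are split equally per account, not proportionally to stake.
    share = total_fees / len(eligible)
    return {acct: share for acct in eligible}

accounts = [f"acct{i}" for i in range(100)]   # hypothetical eligible set
payouts = split_dex_fees(400.0, accounts)

# Controlling 25 of the 100 eligible accounts collects 25% of all fees.
attacker_take = sum(payouts[a] for a in accounts[:25])
assert attacker_take == 100.0
```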

You could currently sell many times less LARYNX to buy the entire order book. The margins may grow or shrink... but they will never get near enough to make this kind of confidence hack worth it.

\n\n

Worth It?

\n\n
", "canonicalUrl": "https://peakd.com/dev/@disregardfiat/oceans-0x"},{"url": "https://hive.blog/witness/@disregardfiat/witness-update-rliial", "probability": 0.8479714, "headline": "Witness Update", "datePublished": "2022-11-24T04:23:58.310474", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23tw7xakyBCuvzqbEekw3eYJztcMtVn42DCWDCZ8FxQv8x1Dsosx6F79Ki4Y3R2vVkgzc.png", "images": ["https://images.hive.blog/DQmZqRvBWLUxHQvMCqG8htzgES6U7HYpae54bw7LK18YTdw/image.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tw7xakyBCuvzqbEekw3eYJztcMtVn42DCWDCZ8FxQv8x1Dsosx6F79Ki4Y3R2vVkgzc.png"], "description": "Short Witness Update by disregardfiat", "articleBody": "Witness\n\n17th of Novemblog\n\nIt's that time again. Where I brag about my witness and the work it's doing. Since the last update I've missed 0 additional blocks. Ran 2 hard forks (like everybody else). During HF 26 I was one of the highest paid back up witnesses... hurray for whatever luck gave me that honor.\n\nI'm currently Rank 42(Rank 41 Active) and produce a block roughly every 70 minutes.\n\nI have 2 witnesses, one of course is signing and the other one has all the plugins enabled and is serving everything except hivemind. You can find this API at hive-api.dlux.io and has some rate limiting and caching set up to handle blockstreams and block range calls that honeycomb network uses most frequently.\n\nI've been doing some research for SPK network that spills over into this area and we just may set up this server to be a little most robust.\n\nI have to say, after getting things set up right it's been a mostly enjoyable experience. The price feed software I run has really been the only pain.", "articleBodyHtml": "
\n\n

Witness

\n\n

17th of Novemblog

\n\n

It's that time again, where I brag about my witness and the work it's doing. Since the last update I've missed 0 additional blocks and ran 2 hard forks (like everybody else). During HF 26 I was one of the highest-paid backup witnesses... hurray for whatever luck gave me that honor.

\n\n
\n\n

I'm currently Rank 42(Rank 41 Active) and produce a block roughly every 70 minutes.

\n\n

I have 2 witnesses: one of course is signing, and the other has all the plugins enabled and is serving everything except hivemind. You can find this API at hive-api.dlux.io; it has some rate limiting and caching set up to handle the blockstream and block-range calls that the HoneyComb network uses most frequently.

\n\n

I've been doing some research for the SPK Network that spills over into this area, and we just may set up this server to be a little more robust.

\n\n

I have to say, after getting things set up right it's been a mostly enjoyable experience. The price feed software I run has really been the only pain.

\n\n
", "canonicalUrl": "https://peakd.com/witness/@disregardfiat/witness-update-rliial"},{"url": "https://hive.blog/dev/@disregardfiat/honeycomb-incentives-and-attacks", "probability": 0.90482897, "headline": "HoneyComb - Incentives and Attacks", "datePublished": "2022-11-24T04:23:58.540850", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23tGVdSYei1xNhbXosFdfEMur4PuE4CGqVnEdyC1UCGQ6NAriaEChrBBtbsqF3AYqq8zq.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tHbTumEN1e9CDarUQuRM9yhTJDg2qzoFjf3hizMcj1ovqVzQWqBepVWB2ywYdCj1MxU.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tGVdSYei1xNhbXosFdfEMur4PuE4CGqVnEdyC1UCGQ6NAriaEChrBBtbsqF3AYqq8zq.png"], "description": "What actually keeps a DEX honest? by disregardfiat", "articleBody": "HoneyComb Vectors\n\n14th of Novemblog\n\nI realize that I've spent thousands of hours in thought, hundreds of hours in code, and tens of hours trying to describe and perfect cross chain security. I've probably spent the most time with @starkerz and he keeps asking me to put into writing what I think about security. So today's the day.\n\nThink Like an Engineer\n\nIf you want real world security there is no perfect solution. It doesn't matter how big your lock is there is always more dynamite/nuclear warheads/asteroids/solar expansion/black holes out there. Engineers therefore have to think in terms of cost-benefit analysis. The chance of an asteroid strike is tiny compared to the cost of mitigating it to preserve my Brownie the Bear (Beanie Baby \u2013 $20,000). Would you spend more for a bike lock than a bike?\n\nLikewise, if I want to steal Brownie the Bear, it should cost me less than $20,000.\n\nGains\n\nThe motives of an attack aren't always financial. 
For instance, the bombing of the Georgia Guidestones only cost resources to carry out and carried a risk of 20 years in prison... but destruction seemed to be the only motivation. Likewise, in a cryptocurrency ecosystem the goals of an attack could be as varied as breaking trust in an ecosystem, writing an academic paper, or just because \"they could\".\n\nLosses\n\nWe can minimize most potential losses in our ecosystem through \"fork protection.\" Any of our valuable data, internal assets, can once again be collectively decided to be the accurate copy for nearly any circumstance. This already happens quite often on most chains, Hive included. It's just the differing opinion of truth is usually 1 account. This goes back to an old argument: Code is Law... well I guess it's like a legal law: One that can be interpreted and changed.\n\nHowever, there is a world of difference in internal and external assets. Oddly enough these two very different things are identical in their own contexts. Coins on Hive (Hive, HBD, Vests/HP) are internal... on Hive, and external in any other context. That includes when you own them, when the are wrapped on a different chain, or when they are in an exchange account. Just the same, SPK, LARYNX, DLUX, DUAT are internal assets in their chains... and external assets when you own them, they are wrapped, or they are in an exchange account.\n\nThis is a little different than bearer instruments like cash. As long as you have the cash it's a quasi internal asset. That's also what makes cash an easy thing to steal... once it's not in your possession it's not your asset. In cryptocurrencies you have to \"convince the world\" that you can change the state of your data. Through cryptographic transaction signing.\n\nIn HoneyComb most data is an internal asset, and has fork protection. As long as there is a group of people who can agree what is valid... 
then it's impossible to change that through any brute force means, social engineering and politicing aside.\n\nHere Comes the BUT\n\nThe people interested in HoneyComb data will always be a larger set of people than those that manage HoneyComb data. The same is true for bitcoin, and ETH, and every financial instrument ever conceived. Try giving a newborn a $2 bill, savings bond, a college fund, or a bitcoin to understand this point. The cost of goods will fluctuate, interest rates(the price of money) will fluctuate, college tutions will change in vale... and bitcoin sure isn't going to be the bastion of stability, and the child never had any impact on this at all.\n\nSo depending on the system, a small number of people can have a larger than their weights outcome on the system. Bitcoin is really the first financial scheme to protect itself from tampering by making honest participation more valuable than dishonest participation. This doesn't at all mean bitcoins as an external asset are immune to the same kinds of issues as any other asset (Stares at FTX administrators) but internally we have collectively probed and found no chinks in it's protocol.\n\nBitcoin's cost to secure comes with quite the pricetag though.\n\nThis is a wild chart, and one thing it doesn't speak to in terms of scaling, is that it's also impossible to scale PoW systems in a parallel way. As a 51% attack on any competing infrastructure would only take the smallest fraction of the bitcoin network. 
You'd likely never know that a container full of mining rigs somewhere got pointed towards inserting a malicious transaction on a bitcoin-esque clone.\n\nWhich means, we are barred from PoW systems both because they can't scale the number of transactions they do to something broadly interactive and because any parallel system will suffer from wildly varying incentives on where to use their PoW hashes.\n\nHive uses a Delegated Proof of Stake (DPoS) system to try and give everybody a say in who represents their interests (Vote for my witness \ud83d\ude1c ) in keeping the internal data in a pristine state.\n\nHoneyComb uses a Proof of Stake (PoS) system to manage external assets.\n\nSnap Back to Reality\n\nWhen conceptualizing this system my internal model was how would I send a valuable item via a third party? If I wanted to mail Brownie the Bear to a buyer. What would need to happen to make sure that no matter what happened, there would be no party to the transaction that lost value? Researching this lead me to the only known conclusion: collateral.\n\nSo my courier would need to be bonded for at least $20,000 before I'd be willing to part with Brownie... if my courier takes Brownie, all he did was purchase it. My recipient would be a little miffed I'm sure, but at least he'd have his $20,000 back.\n\nWhat can be an incredibly complex system in the real world is only a few thousand lines of code in HoneyComb. Anybody who wants to run a node, can run a node. They are pretty cheap to run... less than $5/month. If you want a share of the DEX fees, you can lock some tokens as collateral. If you've locked enough to have a positive impact on ecosystem(and are one of the top 25-79 accounts that do so depending on the communities votes) then you'll receive an equal share of the DEX fees. 
If you've locked more than most of the nodes in that group you'll be assigned as a key holder for external assets; and you're collateral will now help determine the safety limit on the open orders carried by the network.\n\nSo let's say there are 25 key holders, simple majority controls the funds and the value of the poorest 13 of those key holders is the maximum steady state wallet size. We can't completely control inflows to this account, but we can near instantly refund or manage orders over this size. Especially with One Block Irreversibility there can be as few as 3 seconds before this under collateralization can be safely managed. Outside of this situation, any combination of the 13 required keyholders to steal funds will result in the community removing their collateral.\n\nIdeally, when the chain recovers the new multi-sig DEX account will be funded (open orders replaced) by the honest community members who will split the collateral forfeit by the dishonest(cheaper tokens). Alternatively, the collateral will be used to fill the open orders, though there could be an imbalance here if extremely pessimistic orders were open (0.001 hive for 100000000 DLUX for instance)... A mixture of the two could preserve token value the best.\n\nWith this paradigm we are ensuring it would be more beneficial to just use the DEX to trade your tokens. The astute among you may point out that time locks on collateral actually prevent this assurance because the time value of assets is an unknown... and well, you're right. No system is prefect... and HoneyComb will strive to be better at managing DEX transaction, and can with the help of everybody who's voted for our funding proposals (SPK and HoneyComb alike).\n\nConclusion\n\nThat's really it. Attacks here can only alter fork protected internal data... or the managed collateralized external funds on Hive. There should be no valuable attack possible. 
And thus far coding errors have only lost peanuts that have been returned when asked. (See yesterday's blog for an example)", "articleBodyHtml": "
\n\n

HoneyComb Vectors

\n\n

14th of Novemblog

\n\n

I realize that I've spent thousands of hours in thought, hundreds of hours in code, and tens of hours trying to describe and perfect cross chain security. I've probably spent the most time with @starkerz and he keeps asking me to put into writing what I think about security. So today's the day.

\n\n

Think Like an Engineer

\n\n

If you want real-world security, there is no perfect solution. It doesn't matter how big your lock is; there is always more dynamite/nuclear warheads/asteroids/solar expansion/black holes out there. Engineers therefore have to think in terms of cost-benefit analysis. The chance of an asteroid strike is tiny compared to the cost of mitigating it to preserve my Brownie the Bear (Beanie Baby \u2013 $20,000). Would you spend more for a bike lock than a bike?

\n\n

Likewise, if I want to steal Brownie the Bear, it should cost me less than $20,000.

\n\n

Gains

\n\n

The motives of an attack aren't always financial. For instance, the bombing of the Georgia Guidestones only cost resources to carry out and carried a risk of 20 years in prison... but destruction seemed to be the only motivation. Likewise, in a cryptocurrency ecosystem the goals of an attack could be as varied as breaking trust in an ecosystem, writing an academic paper, or just because \"they could\".

\n\n

Losses

\n\n

We can minimize most potential losses in our ecosystem through \"fork protection.\" Any of our valuable data, internal assets, can once again be collectively decided to be the accurate copy in nearly any circumstance. This already happens quite often on most chains, Hive included. It's just that the differing opinion of truth is usually 1 account. This goes back to an old argument: Code is Law... well, I guess it's like a legal law: one that can be interpreted and changed.

\n\n

However, there is a world of difference between internal and external assets. Oddly enough, these two very different things are identical in their own contexts. Coins on Hive (Hive, HBD, Vests/HP) are internal... on Hive, and external in any other context. That includes when you own them, when they are wrapped on a different chain, or when they are in an exchange account. Just the same, SPK, LARYNX, DLUX, and DUAT are internal assets in their chains... and external assets when you own them, when they are wrapped, or when they are in an exchange account.

\n\n

This is a little different than bearer instruments like cash. As long as you have the cash, it's a quasi-internal asset. That's also what makes cash an easy thing to steal... once it's not in your possession, it's not your asset. In cryptocurrencies you have to \"convince the world\" that you can change the state of your data, through cryptographic transaction signing.

\n\n

In HoneyComb most data is an internal asset, and has fork protection. As long as there is a group of people who can agree what is valid... then it's impossible to change that through any brute-force means, social engineering and politicking aside.

\n\n

Here Comes the BUT

\n\n

The people interested in HoneyComb data will always be a larger set than the people who manage HoneyComb data. The same is true for Bitcoin, and ETH, and every financial instrument ever conceived. Try giving a newborn a $2 bill, a savings bond, a college fund, or a bitcoin to understand this point. The cost of goods will fluctuate, interest rates (the price of money) will fluctuate, college tuitions will change in value... and bitcoin sure isn't going to be the bastion of stability, and the child never had any impact on any of this at all.

\n\n

So depending on the system, a small number of people can have a larger-than-their-weight impact on the system. Bitcoin is really the first financial scheme to protect itself from tampering by making honest participation more valuable than dishonest participation. This doesn't at all mean bitcoins as an external asset are immune to the same kinds of issues as any other asset (stares at FTX administrators), but internally we have collectively probed and found no chinks in its protocol.

\n\n

Bitcoin's cost to secure comes with quite the price tag, though.

\n\n
\n\n

This is a wild chart, and one thing it doesn't speak to in terms of scaling is that it's also impossible to scale PoW systems in a parallel way, as a 51% attack on any competing infrastructure would only take the smallest fraction of the Bitcoin network. You'd likely never know that a container full of mining rigs somewhere got pointed towards inserting a malicious transaction on a bitcoin-esque clone.

\n\n

Which means, we are barred from PoW systems both because they can't scale the number of transactions they do to something broadly interactive and because any parallel system will suffer from wildly varying incentives on where to use their PoW hashes.

\n\n

Hive uses a Delegated Proof of Stake (DPoS) system to try and give everybody a say in who represents their interests (Vote for my witness \ud83d\ude1c ) in keeping the internal data in a pristine state.

\n\n

HoneyComb uses a Proof of Stake (PoS) system to manage external assets.

\n\n

Snap Back to Reality

\n\n

When conceptualizing this system, my internal model was: how would I send a valuable item via a third party? If I wanted to mail Brownie the Bear to a buyer, what would need to happen to make sure that no matter what happened, no party to the transaction lost value? Researching this led me to the only known conclusion: collateral.

\n\n

So my courier would need to be bonded for at least $20,000 before I'd be willing to part with Brownie... if my courier takes Brownie, all he did was purchase it. My recipient would be a little miffed I'm sure, but at least he'd have his $20,000 back.

\n\n

What can be an incredibly complex system in the real world is only a few thousand lines of code in HoneyComb. Anybody who wants to run a node can run a node. They are pretty cheap to run... less than $5/month. If you want a share of the DEX fees, you can lock some tokens as collateral. If you've locked enough to have a positive impact on the ecosystem (and are one of the top 25-79 accounts that do so, depending on the community's votes) then you'll receive an equal share of the DEX fees. If you've locked more than most of the nodes in that group, you'll be assigned as a key holder for external assets, and your collateral will now help determine the safety limit on the open orders carried by the network.

\n\n

So let's say there are 25 key holders: a simple majority controls the funds, and the value of the poorest 13 of those key holders is the maximum steady-state wallet size. We can't completely control inflows to this account, but we can near-instantly refund or manage orders over this size. Especially with One Block Irreversibility, there can be as few as 3 seconds before this under-collateralization can be safely managed. Outside of this situation, any combination of the 13 key holders required to steal funds will result in the community removing their collateral.

\n\n
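Here is one way to read that rule as code, assuming \"the value of the poorest 13\" means their combined collateral (names and numbers are illustrative, not HoneyComb's actual implementation):

```python
def key_holders(collateral, n=25):
    # The top accounts by locked collateral hold the multi-sig keys.
    ranked = sorted(collateral.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:n]

def max_wallet_size(holders):
    # A simple majority (13 of 25) can sign, so the steady-state wallet is
    # capped at the combined collateral of the poorest possible quorum:
    # any colluding quorum forfeits at least as much as it could steal.
    quorum = len(holders) // 2 + 1
    poorest = sorted(c for _, c in holders)[:quorum]
    return sum(poorest)

collateral = {f"acct{i}": float(i) for i in range(1, 31)}  # 30 stakers
holders = key_holders(collateral)            # top 25: stakes 6..30
assert len(holders) == 25
assert max_wallet_size(holders) == sum(range(6, 19))  # poorest 13: 6..18
```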

Ideally, when the chain recovers, the new multi-sig DEX account will be funded (open orders replaced) by the honest community members, who will split the collateral forfeited by the dishonest (cheaper tokens). Alternatively, the collateral will be used to fill the open orders, though there could be an imbalance here if extremely pessimistic orders were open (0.001 HIVE for 100000000 DLUX, for instance)... A mixture of the two could preserve token value the best.

\n\n

With this paradigm we are ensuring it would be more beneficial to just use the DEX to trade your tokens. The astute among you may point out that time locks on collateral actually prevent this assurance, because the time value of assets is an unknown... and well, you're right. No system is perfect... and HoneyComb will strive to be better at managing DEX transactions, and can with the help of everybody who's voted for our funding proposals (SPK and HoneyComb alike).

\n\n

Conclusion

\n\n

That's really it. Attacks here can only alter fork-protected internal data... or the managed, collateralized external funds on Hive. There should be no valuable attack possible. And thus far, coding errors have only lost peanuts that have been returned when asked. (See yesterday's blog for an example.)

\n\n
", "canonicalUrl": "https://peakd.com/dev/@disregardfiat/honeycomb-incentives-and-attacks"},{"url": "https://hive.blog/dev/@disregardfiat/honeycomb-defi-where-community-matters", "probability": 0.97585416, "headline": "HoneyComb - DeFi Where Community Matters", "datePublished": "2022-11-24T04:23:59.725777", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23wgguBGuMECLR2F93XZLFjSWZZ398SAPJ9YNCkGCzKapTsFdoHGVvW1N28DJM5JyNRLX.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23wgguBGuMECLR2F93XZLFjSWZZ398SAPJ9YNCkGCzKapTsFdoHGVvW1N28DJM5JyNRLX.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/zottone444/23t7AyKqAfdxKEJPQrpePMW15BCPhbyrf5VoHWxhBFcEcPLjDUVVQAh9ZAopbmoJDekS6.png"], "description": "Developing on Hive is a true joy. by disregardfiat", "articleBody": "Hive is Amazing\n\nDay 13 in NovemBlog\n\nProgress Updates\n\nI've been using the much smaller DLUX iteration of HoneyComb to test changes and yesterday it all paid off. Let's explain how the Multi-Signature wallet works for the DEX so I can explain the issue, and the solution.\n\nThe DEX here isn't somewhere you just \"leave\" a balance, you have an open order. It's tokens are as decentralized as a main chain... where they are still managed exclusively by your keys. However, Sending orders/Hive to our DEX account is different. This is an account that's not on our chain, it's on Hive. While it's pretty easy to manage Hive accounts from a block stream(and not at all easy to manage RCs from a block stream) it does suffer from issues of key management.\n\nTo remedy this I've built autonomous controllers for multi-signature accounts. To put it in simple terms, they build identical transactions and only pass the signatures thru hive. 
These transactions expire after no more than an hour, and when a node reads a signature on chain... if it has collected enough signatures... it tries to broadcast the transaction.\n\nWhat happens when the expiration passes? Well, when the nodes don't get enough signatures they just take the old transaction data and build a new transaction with the current parameters. It pulls a time stamp from a block so the nodes don't have to worry about synchronizing a time stamp... it deletes the old transaction, which was never passed around, stores the new one... and sends off a signature.\n\nSo yesterday I was a little surprised when the DLUX chain spit out a few of the same transactions to @blockgolem. He was too, and messaged me first about how to deal with the excess funds in his account. He did a little math and sent back what he believed to be the overages, and I put them back in the DEX. (Any multi-sig keyholder can reinsert funds into the DEX by sending them with the memo of 'IGNORE'.) Before getting into why this happened, let's just take a moment to consider how amazing our accounts and the reputation that is intrinsic to our community are.\n\nNo KYC No Problem\n\nThe lack of KYC is one of the biggest features here. When FTX goes down, so many people have incentives to probe the systems. It looks like there was an internal hack and funds got sent to Vitalik and CZ... some other people are claiming the Democrats are laundering money thru FTX... the point is, there is a database of credentials that may get leaked intentionally, non-intentionally, or just become public record as part of a bankruptcy hearing. Identity fraud and worse then become an issue. All of that KYC didn't prevent massive fraud, potential money laundering, and theft.\n\nHere on HoneyComb, when somebody receives $25 too much they send it back. This isn't the first time, and if there are more bugs it hopefully won't be the last time. 
There is no database to steal, no additional complications with storing PII, and we still have honesty, reputation... and in this case some rewards.\n\nI'm keeping 1% because I'm still trying to get my Power Up Month Badge ;)\n@blockgolem also tried to decline this, but I'm hoping to set an example that others can and should follow. Good deeds don't go unpunished!\n\nBut Why The DoubleSpend?\n\nWell there are two reasons for this and one solution. When a node makes a signature they try and send the broadcast... which means, when we ran updates yesterday, there was a slight sync issue: 2/3 of the required signatures were already in the block stream, and as the 3rd node was catching up, it signed and broadcasted still-valid but older transactions before deleting them internally.\n\nIs sending a broadcast when it's signed a bad thing? I was thinking so when I informed the node runner chat. But as soon as you say something out loud the real issue comes up. There were multiple valid transactions!! So an emergency fix got put in place on both chains. It reduced the expiration time on the multi-sig transactions from 1 hour to 5 minutes. This coincided with the re-signing cycle on HoneyComb, and now the transactions will expire in the real world just as they expire on chain.\n\nLots of Work\n\nIt may not seem like it, but things are getting a lot better from many standpoints. As more people use the networks there have been fewer and fewer issues. Some things like block streams drop out even on my nearly unutilized public Hive node. People have been telling me Hive-Engine has been suffering a bit as well.\n\nWith the recent state checks to ensure proper database reassembly, expanding multi-sig control past 3 key holders, the block-processor getting its kinks worked out, and all the other things I've mentioned or coded... I'm feeling ever more confident in the system overall. 
When just 2/3 keyholders made the switch to the new change last night (because it was the middle of the night on a Saturday) the chain responded well, double transactions weren't made... and when enough nodes switched to the latest version everything stabilized as intended.", "articleBodyHtml": "
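The multi-sig flow described in this post — every node deterministically rebuilds an expired transaction from shared on-chain parameters, using a block's timestamp so no clock synchronization is needed — can be sketched roughly as follows. This is a minimal illustration, not the HoneyComb code; every name here is hypothetical.

```javascript
// Hypothetical sketch of the rebuild-on-expiry cycle the post describes.
const EXPIRATION_MS = 5 * 60 * 1000; // reduced from 1 hour to 5 minutes by the fix

// Build a transaction deterministically from shared state, so every node
// produces an identical transaction and only signatures travel over Hive.
function buildTx(params, blockTimestampMs) {
  return {
    params,
    expiration: blockTimestampMs + EXPIRATION_MS,
  };
}

// Called as each block is processed: keep an unexpired transaction, or
// drop the stale one and rebuild it from the current parameters, taking
// the timestamp from the block rather than from any node's local clock.
function refreshIfExpired(tx, currentParams, blockTimestampMs) {
  if (blockTimestampMs < tx.expiration) return tx; // still valid
  return buildTx(currentParams, blockTimestampMs);
}
```

Because the rebuild is a pure function of on-chain inputs, nodes that have never exchanged anything but signatures still agree on the exact transaction to sign.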
", "canonicalUrl": "https://peakd.com/dev/@disregardfiat/honeycomb-defi-where-community-matters"},{"url": "https://hive.blog/dao/@disregardfiat/dao-theory", "probability": 0.9690452, "headline": "DAO Theory", "datePublished": "2022-11-24T04:24:01.601744", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23w2gjnrZtLgzfnPrYB99dYUwjWGt7LFkws3itRoHtupc3EvKU7pfAYdR6yCUmShadPYb.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23w2gjnrZtLgzfnPrYB99dYUwjWGt7LFkws3itRoHtupc3EvKU7pfAYdR6yCUmShadPYb.png"], "description": "Thinking about the legality of things. by disregardfiat", "articleBody": "Decentralized Autonomous Organizations\n\nBlog everyday November #11\n\nLet's just start this discussion with the biggest of disclaimers. I am not a lawyer, I've not taken the Bar, I'm not certified in any jurisdiction to give legal advice. But that doesn't mean I haven't done a little bit of research and due diligence into this arena.\n\nLegal Challenges\n\nIt seems like there is only one jurisdiction that people are scared of in this field. We have countries like India and Ecuador that have banned bitcoin... and roughly nobody seems to care; then we have the USA which doesn't issue regulations, they just take people and projects to court or outright sanction technology. What the US does have is very well defined rights and the oldest legal precedents of any constitutional republic.\n\nCongress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. -- First Amendment to the US Constitution\n\nThis is kind of a show stopper in my view. Speech is basically anything the courts have continually ruled. Bernstein v. 
Department of Justice established that software is protected speech and scrapped export control of encryption algorithms... which until that time were classified as a munition. Citizens United v. Federal Election Commission opined that money is speech and corporations can exercise that right with basically unlimited campaign finance donations. Which in my unqualified view basically covers everything that cryptocurrencies are or ever will be.\n\nDAO IRL\n\nLet's imagine that software didn't exist, that encryption didn't exist... but you wanted to form a cooperative business. Cooperative businesses have some terrific income tax laws as the goal of the organization isn't profit, and earnings are distributed to its members... a refund for underutilized membership dues. Co-Ops are in several sectors of the economy, all the way through insurance and credit unions, which means talking about a DAO in these terms isn't far from an established reality.\n\nHow can we organize our co-op without having leaders?\n\nLet's imagine that our town had a public square or forum where you could bring your wares. A flea-market or farmers market would probably be a good example here. When you arrive you have your invoices and also a copy of the books that everybody else maintains as a condition of the co-op. You run around the market, talk with your fellows, check the final numbers and spot-check a few random balances... knowing that any errors would compound. After everybody has had their coffee you sit in a circle and discuss your needs and resources. Let's say you need 2 woods, but have an extra bushel of wheat, everybody can record that in their ledger and at the close of business the resources are allocated according to the agreed-upon bylaws. There is an excess of a few items, the sheep all get sold at a market value and the proceeds are equitably distributed.\n\nAt this point income tax might come into play. 
But the entire proceedings and all of the process was just an exercise of free speech, free assembly, and depending on how the results are published (publicly)... free press.\n\nThe SEC likes to come around from time to time and wonder... is this a security? Well... here we have to wonder why the SEC can get involved at all... as Citizens United establishes that money is speech, and therefore not subject to regulation. In the case of our co-op it should be abundantly clear that if Joe is sick for market on Saturday, his bricks won't make it to market, and he also won't receive any ore.\n\nDecentralization\n\nThere are certainly a few extra steps in the proceedings when it comes to DAOs. Software and encryption are key, but protected, components. The most important thing to keep in mind when wondering if you are dealing with the murky waters of securities or the protected areas of speech is whether any party's future efforts are key to the success of the endeavor. When all the software is open-sourced, anybody can improve it and run it. When the keys are held at large, anybody can hold them, and if somebody loses their keys there are protections for that. We are now in a place where single points of failure don't exist... and in my unqualified view, a place where securities law doesn't apply.\n\nAs a technical person I know that geofencing is pointless... nearly every video you watch on YouTube shows you how to avoid them to watch an out-of-region show on Netflix. A great many things are pointless... terms and conditions are often unenforceable, how a user interacts with a DAO could be from a website or their cli-wallet. I guess this is just to say, trust absolutely no one. No provider, no client, no account, period. Only once something is signed and accepted by the community is it to be trusted. 
This viewpoint will save you from a great many headaches by understanding that blockchains are really good at record-keeping; but using them in no way guarantees a certain outcome. Back to the co-op... even on a global scale we can't ensure sufficient food production it seems; grapes change their flavors with the conditions and some years just aren't a good vintage for wines.", "articleBodyHtml": "
", "canonicalUrl": "https://peakd.com/dao/@disregardfiat/dao-theory"},{"url": "https://hive.blog/dev/@disregardfiat/honeycomb-processor-update", "probability": 0.65198237, "headline": "HoneyComb - Processor Update", "datePublished": "2022-11-24T04:24:04.612096", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23xATtwWXpurDnXkVj4M4bPPisRrMtCHWJw94KbQ5V3mYsqcyAPUHN4F6WnXpF1asBy9r.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23xATtwWXpurDnXkVj4M4bPPisRrMtCHWJw94KbQ5V3mYsqcyAPUHN4F6WnXpF1asBy9r.png"], "description": "Might as well put your head in a microwave... don't click this post with out a strong curiosity and technical prowess. by disregardfiat", "articleBody": "Brain Melting Code Review\n\nNovemblog 12\n\nHoneyComb - block processor\n\nThis is... well. A block processor. It gets blocks from an API and matches any transactions to smart contracts.\nThere is a fair amount going on here... and I guess it's best to break it into a few pieces.\n\nconst fetch = require(\"node-fetch\"); const { TXID } = require(\"./index\"); module.exports = function ( client, nextBlock = 1, prefix = \"dlux_\", account = \"null\", vOpsRequired = false ) { var onCustomJsonOperation = {}; // Stores the function to be run for each operation id. 
var onOperation = {}; var onNewBlock = function () {}; var onStreamingStart = function () {}; var behind = 0; var head_block; var isStreaming; var vOps = false; var stream; var blocks = { processing: 0, completed: nextBlock, stop: function () { blocks.clean(1); }, ensure: function (last) { setTimeout(() => { if (!blocks.processing && blocks.completed == last) { getBlockNumber(nextBlock); if (!(last % 3)) getHeadOrIrreversibleBlockNumber(function (result) { if (nextBlock < result - 5) { behind=result - nextBlock; beginBlockComputing(); } else if (!isStreaming) { beginBlockStreaming(); } }); } }, 1000); }, clean: function (stop=false) { var blockNums=Object.keys(blocks); for (var i=0; i < blockNums.length; i++) { if ( (parseInt(blockNums[i]) && parseInt(blockNums[i]) < nextBlock - 1) || (stop && parseInt(blockNums[i])) ) { delete blocks[blockNums[i]]; if (vOps) delete blocks[blockNums.v[i]]; } } var blockNums=Object.keys(blocks.v); for (var i=0; i < blockNums.length; i++) { if ( (parseInt(blockNums[i]) && parseInt(blockNums[i]) < nextBlock - 1) || (stop && parseInt(blockNums[i])) ) { delete blocks.v[blockNums[i]]; } } }, v: {}, requests: { last_range: 0, last_block: 0, }, manage: function (block_num, vOp=false) { if (!head_block || block_num> head_block || !(block_num % 100)) getHeadOrIrreversibleBlockNumber(function (result) { head_block = result; behind = result - nextBlock; }); if ( !(block_num % 100) && head_block > blocks.requests.last_range + 200 && Object.keys(blocks).length < 1000 ) { gbr(blocks.requests.last_range + 1, 100, 0); } if ( !(block_num % 100) && head_block - blocks.requests.last_range + 1100 ) { gbr(blocks.requests.last_range + 1, 100, 0); } if (!(block_num % 100)) blocks.clean(); if (blocks.processing) { setTimeout(()=> { blocks.manage(block_num); }, 100); blocks.clean(); } else if (vOps && !blocks.v[block_num]) return; else if (vOp && !blocks[block_num]) return; else if (blocks[block_num] && block_num == nextBlock) { blocks.processing = 
nextBlock; processBlock(blocks[block_num]).then(() => { nextBlock = block_num + 1; blocks.completed = blocks.processing; blocks.processing = 0; delete blocks[block_num]; if (blocks[nextBlock]) blocks.manage(nextBlock); }); } else if (block_num > nextBlock) { if (blocks[nextBlock]) { processBlock(blocks[nextBlock]).then(() => { delete blocks[nextBlock]; nextBlock++; blocks.completed = blocks.processing; blocks.processing = 0; if (blocks[nextBlock]) blocks.manage(nextBlock); }); } else if (!blocks[nextBlock]) { getBlock(nextBlock); } if (!isStreaming || behind < 5) { getHeadOrIrreversibleBlockNumber(function (result) { head_block=result; if (nextBlock < result - 3) { behind=result - nextBlock; beginBlockComputing(); } else if (!isStreaming) { beginBlockStreaming(); } }); } } blocks.ensure(block_num); }, };\n\nI've removed catch blocks from all of this to make it more compact... But it is just the same as it is in the file. Define some scoped variables. blocks is a small block buffer that will hold up to ~1000 blocks to be processed. By design, all of the processing is single-threaded: one block at a time is processed, and each transaction in the block is processed in the same way. clean() removes all the blocks that have been processed from the buffer. ensure() will try to catch up to live if the block stream drops out for any reason. v{} will hold virtual operations if any virtual operations need to be looked at. 
manage() will process a block and, based on the block number, run a few housekeeping tasks.\n\nvar stopping=false; function getHeadOrIrreversibleBlockNumber(callback) { client.database.getDynamicGlobalProperties().then(function (result) { callback(result.last_irreversible_block_num); }); } function getVops(bn) { return new Promise((resolve, reject)=> { fetch(client.currentAddress, { body: `{\"jsonrpc\":\"2.0\", \"method\":\"condenser_api.get_ops_in_block\", \"params\":[${bn},true], \"id\":1}`, headers: { \"Content-Type\": \"application/x-www-form-urlencoded\", \"User-Agent\": `${prefix}HoneyComb/${account}`, }, method: \"POST\", }) .then((res) => res.json()) .then((json) => { if (!json.result) { blocks.v[bn] = []; blocks.manage(bn, true); } else { blocks.v[bn] = json.result; blocks.manage(bn, true); } }) }); }\n\nThe lack of a get virtual ops in range call really cramps the style of this processor for a few things... This might be some work HAF is custom made to handle.\n\nfunction isAtRealTime(computeBlock) { getHeadOrIrreversibleBlockNumber(function (result) { head_block = result; if (nextBlock >= result) { beginBlockStreaming(); } else { behind = result - nextBlock; computeBlock(); } }); }\n\nThis shifts between get-block-range calls and dhive's multi-API block stream call.\n\nfunction getBlockNumber(bln) { client.database .getBlock(bln) .then((result) => { if (result) { blocks[parseInt(result.block_id.slice(0, 8), 16)] = result; blocks.manage(bln); } }) }\n\nHere we can see how a block gets loaded into the block buffer. This call only comes from blocks.ensure().\n\nfunction getBlock(bn) { if (behind && !stopping) gbr(bn, behind > 100 ? 
100 : behind, 0); if (stopping) stream = undefined; else if (!stopping) gb(bn, 0); }\n\nThis call is just a traffic cop to keep things moving depending on conditions.\n\nfunction gb(bln, at) { //getBlock( block number, attempt) if (blocks[bln]) { blocks.manage(bln); return; } else if (blocks.requests.last_block == bln) return; if (bln < TXID.saveNumber + 50) { blocks.requests.last_block=bln; client.database .getBlock(bln) .then((result)=> { blocks[parseInt(result.block_id.slice(0, 8), 16)] = result; blocks.manage(bln); }) .catch((err) => { if (at < 3) { setTimeout(()=> { gbr(bln, at + 1); }, Math.pow(10, at + 1)); } else { console.log(\"Get block attempt:\", at, client.currentAddress); } }); } else { setTimeout(() => { gb(bln, at + 1); }, Math.pow(10, at + 1)); } }\n\nTries to prevent extra API calls, won't make single block API calls if the request differs from the current processed block by more than 50.\n\nfunction gbr(bln, count, at) { if (!at && blocks.requests.last_range > bln) return; //prevents double API calls, unless it's a reattempt console.log({ bln, count, at }); if (!at) blocks.requests.last_range = bln + count - 1; //doesn't update the buffer get head for reattempts fetch(client.currentAddress, { body: `{\"jsonrpc\":\"2.0\", \"method\":\"block_api.get_block_range\", \"params\":{\"starting_block_num\": ${bln}, \"count\": ${count}}, \"id\":1}`, headers: { \"Content-Type\": \"application/x-www-form-urlencoded\", \"User-Agent\": `${prefix}HoneyComb/${account}`, //tattles on a user for heavy API calls, hopefully to prevent DDoS Bans }, method: \"POST\", }) .then((res) => res.json()) .then((result) => { try { var Blocks = result.result.blocks; for (var i = 0; i < Blocks.length; i++) { //range call blocks are in a slightly different configuration and need to be put into the streaming format const bkn=parseInt(Blocks[i].block_id.slice(0, 8), 16); for (var j=0; j < Blocks[i].transactions.length; j++) { Blocks[i].transactions[j].block_num=bkn; 
Blocks[i].transactions[j].transaction_id=Blocks[i].transaction_ids[j]; Blocks[i].transactions[j].transaction_num=j; var ops=[]; for ( var k=0; k < Blocks[i].transactions[j].operations.length; k++ ) { ops.push([ Blocks[i].transactions[j].operations[k].type.replace( \"_operation\", \"\" ), Blocks[i].transactions[j].operations[k].value, ]); } Blocks[i].transactions[j].operations=ops; blocks[bkn]=Blocks[i]; } } blocks.manage(bln); } catch (e) { //exponential back off and retry attempter if (at < 3) { setTimeout(()=> { gbr(bln, count, at + 1); }, Math.pow(10, at + 1)); } } }) .catch((err) => { if (at < 3) { setTimeout(()=> { gbr(bln, count, at + 1); }, Math.pow(10, at + 1)); } }); }\n\nRange calls are probably the hardest to understand by looking at the code... Roughly 1 in 50 range calls fail to my node on the first attempt, as well as range blocks have a different structure than streamed blocks. These extra checks are mostly to deal with this.\n\nfunction beginBlockComputing() { var blockNum = nextBlock; // Helper variable to prevent race condition blocks.ensure(nextBlock); getBlock(blockNum); } function beginBlockStreaming() { isStreaming = true; onStreamingStart(); stream = client.blockchain.getBlockStream(); stream.on(\"data\", function (Block) { var blockNum = parseInt(Block.block_id.slice(0, 8), 16); blocks[blockNum] = Block; blocks.requests.last_block = blockNum; blocks.requests.last_range = blockNum; blocks.manage(blockNum); }); stream.on(\"end\", function () { console.error( \"Block stream ended unexpectedly. 
Restarting block computing.\" ); beginBlockComputing(); stream = undefined; }); stream.on(\"error\", function (err) { beginBlockComputing(); stream = undefined; }); }\n\nStart and stop block streams, and keep the get block variables correct.\n\nfunction transactional(ops, i, pc, num, block, vops) { if (ops.length) { doOp(ops[i], [ops, i, pc, num, block, vops]) .then((v) => { if (ops.length > i + 1) { transactional(v[0], v[1] + 1, v[2], v[3], v[4], v[5]); } else { onNewBlock(num, v, v[4].witness_signature, { timestamp: v[4].timestamp, block_id: v[4].block_id, block_number: num, }) .then((r) => { pc[0](pc[2]); }) // } } }) .catch((e) => { pc[1](e); }); } else if (parseInt(block.block_id.slice(0, 8), 16) != num) { pc[0](); } else { onNewBlock(num, pc, block.witness_signature, { timestamp: block.timestamp, block_id: block.block_id, block_number: num, }) .then((r) => { r[0](); }) .catch((e) => { pc[1](e); }); }\n\nAttaches header data to each transaction so the processor can be block agnostic when computing. block number, witness signature, timestamp... all have important uses in NFT creation, cron jobs, and DEX ordering. Transactional builds a promise chain that effectively gives each transaction a database lock. Transaction_ids let us have a way to verify hive transactions were processed on the layer 2.\n\nfunction doOp(op, pc) { return new Promise((resolve, reject) => { if (op.length == 4) { onCustomJsonOperation[op[0]](op[1], op[2], op[3], [ resolve, reject, pc, ]); } else if (op.length == 2) { onOperation[op[0]](op[1], [resolve, reject, pc]); } }); }\n\nCalling the appropriate smart contracts by name. 
These can be triggered from any operation, such as a send or a vote.\n\nfunction doVop(op, pc) { return new Promise((resolve, reject) => { console.log(op, pc); onVOperation[op[0]](op[1], [resolve, reject, pc]); }); } }\n\nCurrently unused, but will do the same for virtual operations like DHF payout.\n\nfunction processBlock(Block, Pvops) { return new Promise((resolve, reject) => { var transactions = Block.transactions; let ops = []; if (parseInt(Block.block_id.slice(0, 8), 16) === nextBlock) { for (var i = 0; i < transactions.length; i++) { for (var j=0; j < transactions[i].operations.length; j++) { var op=transactions[i].operations[j]; if (op[0]===\"custom_json\") { //console.log('check') if (typeof onCustomJsonOperation[op[1].id]===\"function\") { var ip=JSON.parse(op[1].json), from=op[1].required_posting_auths[0], active=false; if ( typeof ip===\"string\" || typeof ip===\"number\" || Array.isArray(ip) ) ip={}; ip.transaction_id=transactions[i].transaction_id; ip.block_num=transactions[i].block_num; ip.timestamp=Block.timestamp; ip.prand=Block.witness_signature; if (!from) { from=op[1].required_auths[0]; active=true; } ops.push([op[1].id, ip, from, active]); //onCustomJsonOperation[op[1].id](ip, from, active); } } else if (onOperation[op[0]] !==undefined) { op[1].transaction_id=transactions[i].transaction_id; op[1].block_num=transactions[i].block_num; op[1].timestamp=Block.timestamp; op[1].prand=Block.witness_signature; ops.push([op[0], op[1]]); //onOperation[op[0]](op[1]); } } } transactional(ops, 0, [resolve, reject], nextBlock, Block, Pvops); } }); }\n\nBreaking the block down to transactions.\n\nreturn { /* Determines a state update to be called when a new operation of the id operationId (with added prefix) is computed. 
*/ on: function (operationId, callback) { onCustomJsonOperation[prefix + operationId]=callback; }, onOperation: function (type, callback) { onOperation[type]=callback; }, onNoPrefix: function (operationId, callback) { onCustomJsonOperation[operationId]=callback; }, /* Determines a state update to be called when a new block is computed. */ onBlock: function (callback) { onNewBlock=callback; }, start: function () { beginBlockComputing(); isStreaming=false; }, getCurrentBlockNumber: function () { return nextBlock; }, isStreaming: function () { return isStreaming; }, onStreamingStart: function (callback) { onStreamingStart=callback; }, stop: function (callback) { if (isStreaming) { isStreaming=false; stopping=true; stream=undefined; blocks.stop(); setTimeout(callback, 1000); } else { blocks.stop(); stopping=true; stopCallback=callback; } }, }; };\n\nAnd finally, allowing the processor to be fed smart contracts and logic outside of the module.", "articleBodyHtml" : "
\n\n

Brain Melting Code Review

\n\n

Novemblog 12

\n\n
\"Explosion
\n\n

HoneyComb - block processor

\n\n

This is... well. A block processor. It gets blocks from an API and matches any transactions to smart contracts.
\nThere is a fair amount going on here... and I guess it's best to break it into a few pieces.

\n\n
const fetch = require(\"node-fetch\");\nconst { TXID } = require(\"./index\");\nmodule.exports = function (\n  client,\n  nextBlock = 1,\n  prefix = \"dlux_\",\n  account = \"null\",\n  vOpsRequired = false\n) {\n  var onCustomJsonOperation = {}; // Stores the function to be run for each operation id.\n  var onOperation = {};\n\n  var onNewBlock = function () {};\n  var onStreamingStart = function () {};\n  var behind = 0;\n  var head_block;\n  var isStreaming;\n  var vOps = false;\n  var stream;\n  var blocks = {\n    processing: 0,\n    completed: nextBlock,\n    stop: function () {\n      blocks.clean(1);\n    },\n    ensure: function (last) {\n      setTimeout(() => {\n        if (!blocks.processing && blocks.completed == last) {\n          getBlockNumber(nextBlock);\n          if (!(last % 3))\n            getHeadOrIrreversibleBlockNumber(function (result) {\n              if (nextBlock < result - 5) {\n                behind = result - nextBlock;\n                beginBlockComputing();\n              } else if (!isStreaming) {\n                beginBlockStreaming();\n              }\n            });\n        }\n      }, 1000);\n    },\n    clean: function (stop = false) {\n      var blockNums = Object.keys(blocks);\n      for (var i = 0; i < blockNums.length; i++) {\n        if (\n          (parseInt(blockNums[i]) && parseInt(blockNums[i]) < nextBlock - 1) ||\n          (stop && parseInt(blockNums[i]))\n        ) {\n          delete blocks[blockNums[i]];\n          if (vOps) delete blocks[blockNums.v[i]];\n        }\n      }\n      var blockNums = Object.keys(blocks.v);\n      for (var i = 0; i < blockNums.length; i++) {\n        if (\n          (parseInt(blockNums[i]) && parseInt(blockNums[i]) < nextBlock - 1) ||\n          (stop && parseInt(blockNums[i]))\n        ) {\n          delete blocks.v[blockNums[i]];\n        }\n      }\n    },\n    v: {},\n    requests: {\n      last_range: 0,\n      last_block: 0,\n    },\n    manage: function (block_num, vOp = 
false) {\n      if (!head_block || block_num > head_block || !(block_num % 100))\n        getHeadOrIrreversibleBlockNumber(function (result) {\n          head_block = result;\n          behind = result - nextBlock;\n        });\n      if (\n        !(block_num % 100) &&\n        head_block > blocks.requests.last_range + 200 &&\n        Object.keys(blocks).length < 1000\n      ) {\n        gbr(blocks.requests.last_range + 1, 100, 0);\n      }\n      if (\n        !(block_num % 100) &&\n        head_block - blocks.requests.last_range + 1100\n      ) {\n        gbr(blocks.requests.last_range + 1, 100, 0);\n      }\n      if (!(block_num % 100)) blocks.clean();\n      if (blocks.processing) {\n        setTimeout(() => {\n          blocks.manage(block_num);\n        }, 100);\n        blocks.clean();\n      } else if (vOps && !blocks.v[block_num]) return;\n      else if (vOp && !blocks[block_num]) return;\n      else if (blocks[block_num] && block_num == nextBlock) {\n        blocks.processing = nextBlock;\n        processBlock(blocks[block_num]).then(() => {\n          nextBlock = block_num + 1;\n          blocks.completed = blocks.processing;\n          blocks.processing = 0;\n          delete blocks[block_num];\n          if (blocks[nextBlock]) blocks.manage(nextBlock);\n        });\n      } else if (block_num > nextBlock) {\n        if (blocks[nextBlock]) {\n          processBlock(blocks[nextBlock]).then(() => {\n            delete blocks[nextBlock];\n            nextBlock++;\n            blocks.completed = blocks.processing;\n            blocks.processing = 0;\n            if (blocks[nextBlock]) blocks.manage(nextBlock);\n          });\n        } else if (!blocks[nextBlock]) {\n          getBlock(nextBlock);\n        }\n        if (!isStreaming || behind < 5) {\n          getHeadOrIrreversibleBlockNumber(function (result) {\n            head_block = result;\n            if (nextBlock < result - 3) {\n              behind = result - nextBlock;\n              
beginBlockComputing();\n            } else if (!isStreaming) {\n              beginBlockStreaming();\n            }\n          });\n        }\n      }\n      blocks.ensure(block_num);\n    },\n  };\n\n
\n\n

I've removed the catch blocks from all of this to make it more compact... but otherwise it is just the same as it is in the file. First we define some scoped variables. blocks is a small block buffer that will hold up to ~1000 blocks waiting to be processed. All of the processing is single threaded: one block at a time is processed, and each transaction in the block is processed the same way. clean() removes blocks that have already been processed from the buffer. ensure() tries to catch back up to live if the block stream drops out for any reason. v{} holds virtual operations if any need to be looked at. manage() processes a block and, based on the block number, performs a few housekeeping tasks.

\n\n
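The buffer idea can be sketched in isolation (makeBuffer and its helpers are my stand-in names, not the HoneyComb API): blocks may arrive in any order, but they are consumed strictly one at a time, lowest block number first, and processed blocks are cleaned away immediately.

```javascript
// Minimal sketch of an in-order block buffer (hypothetical names, not the
// HoneyComb API): blocks may arrive out of order, but only the next
// expected block is ever processed, single threaded.
function makeBuffer(startBlock, process) {
  let nextBlock = startBlock;
  const buffer = {};
  return {
    add(blockNum, block) {
      buffer[blockNum] = block;
      // Drain every consecutive block starting at nextBlock.
      while (buffer[nextBlock]) {
        process(nextBlock, buffer[nextBlock]);
        delete buffer[nextBlock]; // like clean(): drop processed blocks
        nextBlock++;
      }
    },
    pending: () => Object.keys(buffer).length,
    next: () => nextBlock,
  };
}

// Usage: block 3 arrives before block 2, but processing stays in order.
const seen = [];
const buf = makeBuffer(1, (num) => seen.push(num));
buf.add(1, {});
buf.add(3, {});
buf.add(2, {});
```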
  var stopping = false;\n\n  function getHeadOrIrreversibleBlockNumber(callback) {\n    client.database.getDynamicGlobalProperties().then(function (result) {\n      callback(result.last_irreversible_block_num);\n    });\n  }\n\n  function getVops(bn) {\n    return new Promise((resolve, reject) => {\n      fetch(client.currentAddress, {\n        body: `{\"jsonrpc\":\"2.0\", \"method\":\"condenser_api.get_ops_in_block\", \"params\":[${bn},true], \"id\":1}`,\n        headers: {\n          \"Content-Type\": \"application/x-www-form-urlencoded\",\n          \"User-Agent\": `${prefix}HoneyComb/${account}`,\n        },\n        method: \"POST\",\n      })\n        .then((res) => res.json())\n        .then((json) => {\n          if (!json.result) {\n            blocks.v[bn] = [];\n            blocks.manage(bn, true);\n          } else {\n            blocks.v[bn] = json.result;\n            blocks.manage(bn, true);\n          }\n        })\n    });\n  }\n\n
\n\n

The lack of a get-virtual-ops-in-range call really cramps this processor's style in a few places... This might be exactly the kind of work HAF is custom made to handle.

\n\n
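Because there is no range call, virtual ops cost one condenser_api.get_ops_in_block request per block. A small sketch (vopRequests is my name, not part of the module) just builds the JSON-RPC request bodies for a span of blocks, with the second param set to true for virtual-only:

```javascript
// One get_ops_in_block request per block: build the JSON-RPC bodies for a
// span of blocks. params[1] = true asks for virtual operations only.
function vopRequests(startBlock, count) {
  const reqs = [];
  for (let bn = startBlock; bn < startBlock + count; bn++) {
    reqs.push({
      jsonrpc: "2.0",
      method: "condenser_api.get_ops_in_block",
      params: [bn, true], // true => virtual operations only
      id: bn,
    });
  }
  return reqs;
}
```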
  function isAtRealTime(computeBlock) {\n    getHeadOrIrreversibleBlockNumber(function (result) {\n      head_block = result;\n      if (nextBlock >= result) {\n        beginBlockStreaming();\n      } else {\n        behind = result - nextBlock;\n        computeBlock();\n      }\n    });\n  }\n\n
\n\n

How we shift between get_block_range calls and dhive's multi-API block stream call.

\n\n
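The decision itself is tiny and can be sketched as a pure function (chooseMode is my stand-in; the threshold mirrors the nextBlock < result - 3 style checks in the code above): while we are several blocks behind the head, keep computing ranges; once we are near live, switch to streaming.

```javascript
// Sketch of the catch-up/stream decision: still behind the head by more
// than the threshold => keep fetching and computing; otherwise stream.
function chooseMode(nextBlock, headBlock, threshold = 3) {
  return nextBlock < headBlock - threshold ? "compute" : "stream";
}
```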
  function getBlockNumber(bln) {\n    client.database\n      .getBlock(bln)\n      .then((result) => {\n        if (result) {\n          blocks[parseInt(result.block_id.slice(0, 8), 16)] = result;\n          blocks.manage(bln);\n        }\n      })\n  }\n
\n\n

Here we can see how a block gets loaded into the block buffer. This call only comes from blocks.ensure().

\n\n
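The parseInt(result.block_id.slice(0, 8), 16) trick shows up throughout the file: the first four bytes (eight hex characters) of a Hive block_id encode the block number, so the buffer can always be keyed from the block itself. A minimal helper (my name for it):

```javascript
// The first 4 bytes (8 hex chars) of a Hive block_id are the block number,
// so a block can be keyed into the buffer from its own id.
function blockNumFromId(blockId) {
  return parseInt(blockId.slice(0, 8), 16);
}
```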
  function getBlock(bn) {\n    if (behind && !stopping) gbr(bn, behind > 100 ? 100 : behind, 0);\n    if (stopping) stream = undefined;\n    else if (!stopping) gb(bn, 0);\n  }\n
\n\n

This call is just a traffic cop to keep things moving depending on conditions.

\n\n
  function gb(bln, at) { //getBlock( block number, attempt)\n    if (blocks[bln]) {\n      blocks.manage(bln);\n      return;\n    } else if (blocks.requests.last_block == bln) return;\n    if (bln < TXID.saveNumber + 50) {\n      blocks.requests.last_block = bln;\n      client.database\n        .getBlock(bln)\n        .then((result) => {\n          blocks[parseInt(result.block_id.slice(0, 8), 16)] = result;\n          blocks.manage(bln);\n        })\n        .catch((err) => {\n          if (at < 3) {\n            setTimeout(() => {\n              gbr(bln, at + 1);\n            }, Math.pow(10, at + 1));\n          } else {\n            console.log(\"Get block attempt:\", at, client.currentAddress);\n          }\n        });\n    } else {\n      setTimeout(() => {\n        gb(bln, at + 1);\n      }, Math.pow(10, at + 1));\n    }\n  }\n\n
\n\n

Tries to prevent extra API calls; it won't make single-block API calls if the requested block differs from the currently processed block by more than 50.

\n\n
  function gbr(bln, count, at) {\n    if (!at && blocks.requests.last_range > bln) return; //prevents double API calls, unless it's a reattempt\n    console.log({ bln, count, at });\n    if (!at) blocks.requests.last_range = bln + count - 1; //doesn't update the buffer get head for reattempts\n    fetch(client.currentAddress, {\n      body: `{\"jsonrpc\":\"2.0\", \"method\":\"block_api.get_block_range\", \"params\":{\"starting_block_num\": ${bln}, \"count\": ${count}}, \"id\":1}`,\n      headers: {\n        \"Content-Type\": \"application/x-www-form-urlencoded\",\n        \"User-Agent\": `${prefix}HoneyComb/${account}`, //tattles on a user for heavy API calls, hopefully to prevent DDoS Bans\n      },\n      method: \"POST\",\n    })\n      .then((res) => res.json())\n      .then((result) => {\n        try {\n          var Blocks = result.result.blocks;\n          for (var i = 0; i < Blocks.length; i++) { //range call blocks are in a slightly different configuration and need to be put into the streaming format\n            const bkn = parseInt(Blocks[i].block_id.slice(0, 8), 16);\n            for (var j = 0; j < Blocks[i].transactions.length; j++) {\n              Blocks[i].transactions[j].block_num = bkn;\n              Blocks[i].transactions[j].transaction_id =\n                Blocks[i].transaction_ids[j];\n              Blocks[i].transactions[j].transaction_num = j;\n              var ops = [];\n              for (\n                var k = 0;\n                k < Blocks[i].transactions[j].operations.length;\n                k++\n              ) {\n                ops.push([\n                  Blocks[i].transactions[j].operations[k].type.replace(\n                    \"_operation\",\n                    \"\"\n                  ),\n                  Blocks[i].transactions[j].operations[k].value,\n                ]);\n              }\n              Blocks[i].transactions[j].operations = ops;\n              blocks[bkn] = Blocks[i];\n            }\n          }\n      
    blocks.manage(bln);\n        } catch (e) { //exponential back off and retry attempter\n          if (at < 3) {\n            setTimeout(() => {\n              gbr(bln, count, at + 1);\n            }, Math.pow(10, at + 1));\n          } \n        }\n      })\n      .catch((err) => {\n        if (at < 3) {\n          setTimeout(() => {\n            gbr(bln, count, at + 1);\n          }, Math.pow(10, at + 1));\n        }\n      });\n  }\n
\n\n

Range calls are probably the hardest part to understand by looking at the code... Roughly 1 in 50 range calls to my node fail on the first attempt, and range blocks have a different structure than streamed blocks. These extra checks are mostly there to deal with those two facts.

\n\n
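The structural difference is easy to see in isolation: block_api.get_block_range returns each operation as an object with a type ending in "_operation" plus a value, while streamed blocks use [type, payload] pairs. A sketch of the same normalization the inner loop performs (normalizeOps is my stand-in name):

```javascript
// Range-call blocks return operations as { type: "transfer_operation",
// value: {...} }; streamed blocks use ["transfer", {...}] pairs. Convert
// the former into the latter, as the loop above does. (Failed range calls
// are retried with exponential backoff: Math.pow(10, at + 1) ms.)
function normalizeOps(operations) {
  return operations.map((op) => [
    op.type.replace("_operation", ""),
    op.value,
  ]);
}
```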
\n  function beginBlockComputing() {\n    var blockNum = nextBlock; // Helper variable to prevent race condition\n    blocks.ensure(nextBlock);\n    getBlock(blockNum);\n  }\n\n  function beginBlockStreaming() {\n    isStreaming = true;\n    onStreamingStart();\n    stream = client.blockchain.getBlockStream();\n    stream.on(\"data\", function (Block) {\n      var blockNum = parseInt(Block.block_id.slice(0, 8), 16);\n      blocks[blockNum] = Block;\n      blocks.requests.last_block = blockNum;\n      blocks.requests.last_range = blockNum;\n      blocks.manage(blockNum);\n    });\n    stream.on(\"end\", function () {\n      console.error(\n        \"Block stream ended unexpectedly. Restarting block computing.\"\n      );\n      beginBlockComputing();\n      stream = undefined;\n    });\n    stream.on(\"error\", function (err) {\n      beginBlockComputing();\n      stream = undefined;\n    });\n  }\n
\n\n

Starts and stops the block streams, and keeps the get-block bookkeeping variables correct.

\n\n
  function transactional(ops, i, pc, num, block, vops) {\n    if (ops.length) {\n      doOp(ops[i], [ops, i, pc, num, block, vops])\n        .then((v) => {\n          if (ops.length > i + 1) {\n            transactional(v[0], v[1] + 1, v[2], v[3], v[4], v[5]);\n          } else {\n            onNewBlock(num, v, v[4].witness_signature, {\n              timestamp: v[4].timestamp,\n              block_id: v[4].block_id,\n              block_number: num,\n            })\n              .then((r) => {\n                pc[0](pc[2]);\n              })\n            // }\n          }\n        })\n        .catch((e) => {\n          pc[1](e);\n        });\n    } else if (parseInt(block.block_id.slice(0, 8), 16) != num) {\n      pc[0]();\n    } else {\n      onNewBlock(num, pc, block.witness_signature, {\n        timestamp: block.timestamp,\n        block_id: block.block_id,\n        block_number: num,\n      })\n        .then((r) => {\n          r[0]();\n        })\n        .catch((e) => {\n          pc[1](e);\n        });\n    }\n
\n\n

Attaches header data to each transaction so the processor can be block agnostic when computing. Block number, witness signature, timestamp... all have important uses in NFT creation, cron jobs, and DEX ordering. transactional() builds a promise chain that effectively gives each transaction a database lock. Transaction IDs give us a way to verify that Hive transactions were processed on the layer 2.

\n\n
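The promise-chain idea can be sketched on its own (runSequential and its handler are my stand-ins, not HoneyComb functions): each operation's handler only starts after the previous one has settled, which is what makes each op behave as if it holds a database lock.

```javascript
// Sketch of the idea behind transactional(): run one async handler per
// operation, strictly in order. transactional() does this with recursion;
// a reduce over the ops array is the condensed equivalent.
function runSequential(ops, handler) {
  return ops.reduce(
    (chain, op) => chain.then((acc) => handler(op).then((r) => [...acc, r])),
    Promise.resolve([])
  );
}

// Usage: handlers resolve in array order even when later ones are "faster".
```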
    function doOp(op, pc) {\n      return new Promise((resolve, reject) => {\n        if (op.length == 4) {\n          onCustomJsonOperation[op[0]](op[1], op[2], op[3], [\n            resolve,\n            reject,\n            pc,\n          ]);\n        } else if (op.length == 2) {\n          onOperation[op[0]](op[1], [resolve, reject, pc]);\n        }\n      });\n    }\n
\n\n

Calling the appropriate smart contracts by name. These can be triggered from any operation, such as a send or a vote.

\n\n
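The dispatch is just a lookup in a registry keyed by name, as doOp does with onCustomJsonOperation and onOperation. A condensed sketch (register, dispatch, and the dlux_send handler are hypothetical stand-ins):

```javascript
// Name-based dispatch sketch: custom_json handlers are keyed by their
// (prefixed) id; plain operation handlers by operation type.
const handlers = {};
function register(id, fn) { handlers[id] = fn; }
function dispatch(op) {
  // op: [id, payload, from, usedActiveKey] for custom_json,
  //     [type, payload] for other operations
  const fn = handlers[op[0]];
  return fn ? fn(...op.slice(1)) : undefined;
}

// Hypothetical contract registered under a prefixed id:
register("dlux_send", (payload, from) => `${from}:${payload.amount}`);
```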
    function doVop(op, pc) {\n      return new Promise((resolve, reject) => {\n        console.log(op, pc);\n        onVOperation[op[0]](op[1], [resolve, reject, pc]);\n      });\n    }\n  }\n\n
\n\n

Currently unused, but it will do the same for virtual operations like DHF payouts.

\n\n
\n  function processBlock(Block, Pvops) {\n    return new Promise((resolve, reject) => {\n      var transactions = Block.transactions;\n      let ops = [];\n      if (parseInt(Block.block_id.slice(0, 8), 16) === nextBlock) {\n        for (var i = 0; i < transactions.length; i++) {\n          for (var j = 0; j < transactions[i].operations.length; j++) {\n            var op = transactions[i].operations[j];\n            if (op[0] === \"custom_json\") {\n              //console.log('check')\n              if (typeof onCustomJsonOperation[op[1].id] === \"function\") {\n                var ip = JSON.parse(op[1].json),\n                  from = op[1].required_posting_auths[0],\n                  active = false;\n                if (\n                  typeof ip === \"string\" ||\n                  typeof ip === \"number\" ||\n                  Array.isArray(ip)\n                )\n                  ip = {};\n                ip.transaction_id = transactions[i].transaction_id;\n                ip.block_num = transactions[i].block_num;\n                ip.timestamp = Block.timestamp;\n                ip.prand = Block.witness_signature;\n                if (!from) {\n                  from = op[1].required_auths[0];\n                  active = true;\n                }\n                ops.push([op[1].id, ip, from, active]); //onCustomJsonOperation[op[1].id](ip, from, active);\n              }\n            } else if (onOperation[op[0]] !== undefined) {\n              op[1].transaction_id = transactions[i].transaction_id;\n              op[1].block_num = transactions[i].block_num;\n              op[1].timestamp = Block.timestamp;\n              op[1].prand = Block.witness_signature;\n              ops.push([op[0], op[1]]); //onOperation[op[0]](op[1]);\n            }\n          }\n        }\n        transactional(ops, 0, [resolve, reject], nextBlock, Block, Pvops);\n      }\n    });\n  }\n
\n\n

Breaking the block down into transactions.

\n\n
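The flattening step can be sketched separately (flattenBlock is my stand-in; the real processBlock also guards against custom_json payloads that parse to strings, numbers, or arrays): walk the transactions, keep the ops we have handlers for, and stamp each payload with the header data the contracts need.

```javascript
// Sketch of processBlock's flattening: collect recognized custom_json ops
// and stamp each payload with block header data for the contracts.
function flattenBlock(block, knownIds) {
  const ops = [];
  for (const tx of block.transactions) {
    for (const [type, body] of tx.operations) {
      if (type === "custom_json" && knownIds.includes(body.id)) {
        const payload = { ...JSON.parse(body.json) };
        payload.transaction_id = tx.transaction_id;
        payload.timestamp = block.timestamp;
        payload.prand = block.witness_signature; // deterministic "randomness"
        ops.push([body.id, payload]);
      }
    }
  }
  return ops;
}
```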
  return {\n    /*\n          Determines a state update to be called when a new operation of the id\n            operationId (with added prefix) is computed.\n        */\n    on: function (operationId, callback) {\n      onCustomJsonOperation[prefix + operationId] = callback;\n    },\n\n    onOperation: function (type, callback) {\n      onOperation[type] = callback;\n    },\n\n    onNoPrefix: function (operationId, callback) {\n      onCustomJsonOperation[operationId] = callback;\n    },\n\n    /*\n          Determines a state update to be called when a new block is computed.\n        */\n    onBlock: function (callback) {\n      onNewBlock = callback;\n    },\n\n    start: function () {\n      beginBlockComputing();\n      isStreaming = false;\n    },\n\n    getCurrentBlockNumber: function () {\n      return nextBlock;\n    },\n\n    isStreaming: function () {\n      return isStreaming;\n    },\n    onStreamingStart: function (callback) {\n      onStreamingStart = callback;\n    },\n\n    stop: function (callback) {\n      if (isStreaming) {\n        isStreaming = false;\n        stopping = true;\n        stream = undefined;\n        blocks.stop();\n        setTimeout(callback, 1000);\n      } else {\n        blocks.stop();\n        stopping = true;\n        stopCallback = callback;\n      }\n    },\n  };\n};\n\n
\n\n

And finally, the returned interface allows the processor to be fed smart contracts and logic from outside the module.

\n\n
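A hypothetical wiring sketch (the handler bodies and fakeProcessor are stand-ins; only the on/onOperation/onBlock surface comes from the module above) shows how a host app would feed in its contracts before calling start():

```javascript
// How a host application might register its "smart contracts" (handler
// bodies are placeholders, not real contracts):
function wire(processor) {
  processor.on("post", (json, from) => { /* NFT mint, DEX order, ... */ });
  processor.onOperation("transfer", (op) => { /* watch deposits */ });
  processor.onBlock((num) => { /* cron jobs, state save */ });
  return processor;
}

// A minimal stand-in with the same registration surface, for illustration:
function fakeProcessor() {
  const reg = { json: {}, op: {}, block: null };
  return {
    on: (id, cb) => (reg.json[id] = cb),
    onOperation: (t, cb) => (reg.op[t] = cb),
    onBlock: (cb) => (reg.block = cb),
    registered: () => reg,
  };
}
```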
", "canonicalUrl": "https://peakd.com/dev/@disregardfiat/honeycomb-processor-update"},{"url": "https://hive.blog/movie/@disregardfiat/black-panther-movie-review", "probability": 0.9404775, "headline": "Black Panther - Movie Review", "datePublished": "2022-11-24T04:24:04.851897", "datePublishedRaw": "5 months ago", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://data.dlux.io/pfp/disregardfiat?hf-00", "description": "Review of Black Panther - Wakanda Forever by disregardfiat", "articleBody": "Black Panther - Wakanda Forever\n\n30% done with blog everyday November\n\nJust walking thru the mall yesterday and we recieved a couple of free Black Panther tickets... and decided to go check it out.\n\nIRL Chadwick Boseman, has succumbed to cancer, which left the sequel to a very well received movie with out it's biggest star, and titular role The Black Panther, King of Wakanda T'Challa. Setting up the events where last movie where the heart shaped herb has all been destroyed... the power of the Black Panther is feared to have been lost forever.\n\nIn general I liked this movie a whole lot more than the last one. There were several times in the last movie where I felt like I shouldn't be cheering at all because of the \"traditional\" way things were handled in Wakandan politics. I'm very happy to say I have few to none of those criticisms this time around.\n\nThe director made an engaging story, true to the isolationist motif of Wakanda, introduced new bad guys, and dealt with them... with out pulling other Marvel characters into the scenes. The MCU has so many heroes now a days that finding a problem that isn't solvable is pretty hard and more often frustrating to watch as you think... why doesn't hero x just do thing y and call it a day.\n\nMost of the movie lacks the Black Panther, and I won't spoil how it ends. 
Needless to say at this point: If you enjoyed the first movie at all, I think you'll enjoy this one even more. Quite a rarity for a sequel, and unheard of for one that is missing it's biggest star.", "articleBodyHtml": "
\n\n

Black Panther - Wakanda Forever

\n\n

30% done with blog everyday November

\n\n

Just walking through the mall yesterday we received a couple of free Black Panther tickets... and decided to go check it out.

\n\n

IRL, Chadwick Boseman succumbed to cancer, which left the sequel to a very well received movie without its biggest star and titular role: The Black Panther, King of Wakanda, T'Challa. Picking up from the events of the last movie, where the heart-shaped herb was all destroyed... the power of the Black Panther is feared to have been lost forever.

\n\n

In general I liked this movie a whole lot more than the last one. There were several times in the last movie where I felt like I shouldn't be cheering at all because of the \"traditional\" way things were handled in Wakandan politics. I'm very happy to say I have few to none of those criticisms this time around.

\n\n

The director made an engaging story, true to the isolationist motif of Wakanda, introduced new bad guys, and dealt with them... without pulling other Marvel characters into the scenes. The MCU has so many heroes nowadays that finding a problem that isn't solvable is pretty hard, and it's often frustrating to watch as you think... why doesn't hero X just do thing Y and call it a day.

\n\n

Most of the movie lacks the Black Panther, and I won't spoil how it ends. Needless to say at this point: if you enjoyed the first movie at all, I think you'll enjoy this one even more. Quite a rarity for a sequel, and unheard of for one that is missing its biggest star.

\n\n
", "canonicalUrl": "https://peakd.com/movie/@disregardfiat/black-panther-movie-review"},{"url": "https://hive.blog/defi/@disregardfiat/honeycomb-dex-vs-ftx", "probability": 0.9250582, "headline": "HoneyComb Dex vs FTX", "datePublished": "2022-11-24T04:24:05.421600", "datePublishedRaw": "5 months ago", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://pbs.twimg.com/media/FhKkFyQXEAA_sUv.jpg", "images": ["https://images.hive.blog/768x0/https://pbs.twimg.com/media/FhKkFyQXEAA_sUv.jpg", "https://images.hive.blog/768x0/https://pbs.twimg.com/media/FhKkFyUWAAEvj0f.jpg"], "description": "FTX sucks, what's next? by disregardfiat", "articleBody": "Another Downturn Another Lesson\n\nDay 10 in NovemBlog\n\nWe're seeing another very public meltdown from a central exchange. As it turns out having an exchange coin is still a bad idea. Leaving funds on an exchange is still a bad idea. Trusting anybody with your crypto is still a bad idea. \"Fool me once, shame on you; fool me twice, shame on me.\" It's unfortunate that the regulating bodies such as the SEC worked with these bad actors... who basically do the exact same things as banks. They use capital \"efficiently\" but dangerously.\n\nListing a token on an centralized exchange is quite a process... a shameful one. Listing fees, liquidity, centralized wallets and attack vectors. @starkerz gave me quite the compliment in a recent conversation: ~\"I'm glad you don't trust anybody.\"\n\nAlternatives\n\nThere are alternatives to centralized exchanges, decentralized exchanges. Listing fees are generally paid in an exchange tokens as well, to prevent spam. The public nature of most of these exchange tokens make it hard to hide the kind of fraud going on in opaque markets. They do have their disadvantages though, and we've seen DeFi hit with a tremendous amount of hacks and exploits while this technology is in it's infancy. 
Algorithmic stable coins and automated market makers have shown their weaknesses.\n\nHere on HIVE we have an internal market and an algorithmic, soft-pegged, stablish coin. The stablish nature of HBD really set's it apart. With the haircut rule and some time locked exchange contracts it currently seems impossible to have a coin-wrecking hyper-inflationary spiral. One could argue we have seen it as bad as it can get when HBD was traded under a dollar for a prolonged time. With it's current 20% savings APR as long as HBD crosses the $1 line inside of 30 days it's as stable as it needs to be... and it's been far more stable than that since HF25. Kudo's @dan @blocktrades and @smooth for these brilliant improvements.\n\nHoneyComb (dlux at the time) was the first token I'm aware of that built it's own DEX outside of the ecosystem it's based on. 0x and swap protocols on ETH made it possible to swap different assets in ETH contracts but... HoneyComb is built as a layer 2 to Hive, which lacks a smart contract platform. As such it went through 2 iterations. The first was an atomic swap protocol... which to my knowledge suffered exactly 0 losses or hacks, and the current autonomously managed multi-sig paradigm, which allows partial fill orders.\n\nCons\n\nDEXs and staking in general suffer from the possibility of impermanent loss. When the staked tokens fall in value relative to other currencies. Most people on Hive will understand this, as their staked Hive Power is worth ~30% less than it was a few days ago before FTX blew up. Until you actually trade the token away this loss doesn't exist. 100 Hive power will still grant you 100 Hive Power worth of resource credits, and still allocated the same percentage of the reward pool thru voting. 
One thing Hive DeFi doesn't have to deal with is front running; where seeing a trade in the MemPool allows somebody to place the same trade with a higher fee to nullify or lessen the first trade.\n\nFront running attacks are possible on Hive, but it would have to be executed by the witness actively signing blocks... which would probably have to be a top 20 witness, and therefore only subject to happen in 5% of the blocks... until they get voted out of the top 20 as you would be able to see these attacks in the block/ subsequent blocks. But hopefully there are enough incentives to stay in the top 20 where witnesses wouldn't be trying to engage in this behavior. Once again, the incentive structure of DPoS really shines through.\n\nPros\n\nSaving the best for last. HoneyComb is enabling a new kind of growth paradigm. When the SPK network started their airdrop they asked the community to run some HoneyComb nodes. The Airdrop wasn't a portion of a premine... Hive was the \"Premine.\" THE DHF paying for the developement. Additionally, users had to interact with the system over the course of a year to draw their full token potential. The developers didn't get a larger share, the founders didn't get a larger share... no centralized exchange had to purchase or be given a wad of token liquidity, no listing fees had to be paid, no single person had to be trusted with moving a ledger from ETH-20 to a native coin. Outside of bitcoin this might be the \"purest\" from of decentralized launch yet. The node operators might have claimed enough to be relevant, might have purchased enough to be relevant, or might have dropped off after understanding the economics of the system. The running cost of a node for a month was less than any operation on ETH; a failed trade on SushiSwap would likely have cost more than a year of running a SPK-CC node.\n\nHoneyComb safely accomplishes this by maintaining it's order book smaller than the market value of it's collective collateral. 
Not having to worry about outside exchanges, much like the internal market place on Hive, allows it to precisely know which value to trust. If somebody wants to swing the market value for one of two reasons, colluding to steal multi-sig funds or get preferential prices, neither will be economically feasible.\n\nLoss?\n\nI personally see abundance in our future... but abundance also means a loss for some people. For instance nobody charges you for breathing the air because of it's abundance. If we create a shortage of something like clean water then some people have a vector to fleece you for what should have been yours. Digitally speaking we are creating an abundance of collective memory and story telling. The price of storage will hopefully be a race to the bottom where it's as close to real cost as possible. If you are interested in cat videos, IPFS will allow you to host those cat videos and participate in a network that makes distribution and storage as free as possible(owning the equipment that let's you access it), the same goes for any content you like to consume. In the long run I hope our endeavors benefit humanity... but playing with the bleeding edge of technology means some cuts along the way.\n\nNone of this is financial advice of course, just explaining systems and hoping that there are fewer cuts in the future.", "articleBodyHtml": "
\n\n

Another Downturn Another Lesson

\n\n

Day 10 in NovemBlog

\n\n

We're seeing another very public meltdown from a central exchange. As it turns out having an exchange coin is still a bad idea. Leaving funds on an exchange is still a bad idea. Trusting anybody with your crypto is still a bad idea. \"Fool me once, shame on you; fool me twice, shame on me.\" It's unfortunate that the regulating bodies such as the SEC worked with these bad actors... who basically do the exact same things as banks. They use capital \"efficiently\" but dangerously.

\n\n


\n

\n\n

Listing a token on a centralized exchange is quite a process... a shameful one. Listing fees, liquidity, centralized wallets and attack vectors. @starkerz gave me quite the compliment in a recent conversation: ~\"I'm glad you don't trust anybody.\"

\n\n

Alternatives

\n\n

There are alternatives to centralized exchanges: decentralized exchanges. Listing fees are generally paid in the exchange's token as well, to prevent spam. The public nature of most of these exchange tokens makes it hard to hide the kind of fraud going on in opaque markets. They do have their disadvantages though, and we've seen DeFi hit with a tremendous number of hacks and exploits while this technology is in its infancy. Algorithmic stablecoins and automated market makers have shown their weaknesses.

\n\n

Here on HIVE we have an internal market and an algorithmic, soft-pegged, stablish coin. The stablish nature of HBD really sets it apart. With the haircut rule and some time-locked exchange contracts it currently seems impossible to have a coin-wrecking hyper-inflationary spiral. One could argue we have seen it as bad as it can get, when HBD traded under a dollar for a prolonged time. With its current 20% savings APR, as long as HBD crosses the $1 line inside of 30 days it's as stable as it needs to be... and it's been far more stable than that since HF25. Kudos to @dan, @blocktrades and @smooth for these brilliant improvements.

\n\n

HoneyComb (dlux at the time) was the first token I'm aware of that built its own DEX outside of the ecosystem it's based on. 0x and swap protocols on ETH made it possible to swap different assets in ETH contracts, but... HoneyComb is built as a layer 2 to Hive, which lacks a smart contract platform. As such it went through two iterations: the first was an atomic swap protocol... which to my knowledge suffered exactly 0 losses or hacks, and the current autonomously managed multi-sig paradigm, which allows partial fill orders.


Cons


DEXs and staking in general suffer from the possibility of impermanent loss: when the staked tokens fall in value relative to other currencies. Most people on Hive will understand this, as their staked Hive Power is worth ~30% less than it was a few days ago, before FTX blew up. Until you actually trade the token away, this loss doesn't exist. 100 Hive Power will still grant you 100 Hive Power worth of resource credits, and still allocates the same percentage of the reward pool through voting. One thing Hive DeFi doesn't have to deal with is front running: seeing a trade in the mempool allows somebody to place the same trade with a higher fee to nullify or lessen the first trade.
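
The "loss doesn't exist until you trade" point is plain arithmetic: the token balance never changes, only its quote in another currency does. A toy illustration with hypothetical prices:

```python
def position_value(tokens: float, price: float) -> float:
    """Quote a fixed token balance in an external currency."""
    return tokens * price

hp = 100.0                           # staked Hive Power; the balance never changed
before = position_value(hp, 0.50)    # hypothetical price before the crash
after = position_value(hp, 0.35)     # hypothetical price ~30% lower
drop = (before - after) / before
print(hp, round(drop, 2))            # balance still 100.0; the quote is down 0.3
```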


Front-running attacks are possible on Hive, but they would have to be executed by the witness actively signing the block... which would probably have to be a top-20 witness, and therefore only possible in roughly 5% of blocks... and only until they get voted out of the top 20, since these attacks would be visible in the block and subsequent blocks. Hopefully there are enough incentives to stay in the top 20 that witnesses wouldn't engage in this behavior. Once again, the incentive structure of DPoS really shines through.
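
The ~5% figure falls out of DPoS block scheduling: each 21-block round is signed by the 20 elected witnesses plus one backup slot, so a single top-20 witness signs about 1/21 of all blocks. The arithmetic:

```python
# In each 21-block round on Hive, the 20 elected witnesses sign one
# block each and a backup witness signs the 21st.
ROUND_SIZE = 21

def share_of_blocks(slots_held: int) -> float:
    """Fraction of all blocks a witness holding `slots_held` slots
    per round gets to sign (and could try to front-run)."""
    return slots_held / ROUND_SIZE

print(round(share_of_blocks(1) * 100, 1))  # ~4.8 percent
```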


Pros


Saving the best for last: HoneyComb is enabling a new kind of growth paradigm. When the SPK Network started their airdrop, they asked the community to run some HoneyComb nodes. The airdrop wasn't a portion of a premine... Hive was the "premine," with the DHF paying for the development. Additionally, users had to interact with the system over the course of a year to draw their full token potential. The developers didn't get a larger share, the founders didn't get a larger share... no centralized exchange had to purchase or be given a wad of token liquidity, no listing fees had to be paid, no single person had to be trusted with moving a ledger from ERC-20 to a native coin. Outside of Bitcoin, this might be the "purest" form of decentralized launch yet. The node operators might have claimed enough to be relevant, might have purchased enough to be relevant, or might have dropped off after understanding the economics of the system. The running cost of a node for a month was less than any operation on ETH; a failed trade on SushiSwap would likely have cost more than a year of running a SPK-CC node.


HoneyComb safely accomplishes this by keeping its order book smaller than the market value of its collective collateral. Not having to worry about outside exchanges, much like the internal market on Hive, allows it to know precisely which value to trust. If somebody wants to swing the market value for one of two reasons, colluding to steal multi-sig funds or getting preferential prices, neither will be economically feasible.
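
That invariant, order book strictly smaller than collateral, can be sketched as a gate on new orders. Names and thresholds here are hypothetical, not HoneyComb's actual parameters:

```python
from dataclasses import dataclass

@dataclass
class Order:
    amount: float  # order value in the DEX's unit of account

def can_accept(open_orders: list[Order], new: Order,
               collateral_value: float) -> bool:
    """Accept a new order only while the whole book stays smaller
    than the multi-sig's collective collateral (hypothetical rule
    shape, illustrating the invariant described above)."""
    book_total = sum(o.amount for o in open_orders) + new.amount
    return book_total < collateral_value

book = [Order(40.0), Order(25.0)]
print(can_accept(book, Order(30.0), 100.0))  # True: 95 < 100
print(can_accept(book, Order(40.0), 100.0))  # False: 105 >= 100
```

As long as the book is worth less than what the signers have at stake, stealing the escrowed funds costs the colluders more than it gains them.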


Loss?


I personally see abundance in our future... but abundance also means a loss for some people. For instance, nobody charges you for breathing air because of its abundance. If we create a shortage of something like clean water, then some people have a vector to fleece you for what should have been yours. Digitally speaking, we are creating an abundance of collective memory and storytelling. The price of storage will hopefully be a race to the bottom, where it's as close to real cost as possible. If you are interested in cat videos, IPFS will allow you to host those cat videos and participate in a network that makes distribution and storage as free as possible (owning the equipment that lets you access it); the same goes for any content you like to consume. In the long run I hope our endeavors benefit humanity... but playing with the bleeding edge of technology means some cuts along the way.


None of this is financial advice of course, just explaining systems and hoping that there are fewer cuts in the future.

", "canonicalUrl": "https://peakd.com/defi/@disregardfiat/honeycomb-dex-vs-ftx"},{"url": "https://hive.blog/dev/@disregardfiat/proof-of-access-content-insertion", "probability": 0.9102454, "headline": "Proof of Access - Content Insertion", "datePublished": "2022-11-24T04:24:08.631996", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23uR2pJT23A3zZaRvd4ZUxZyMACQWH27rzuyspQNefGjfgJx1aXe3cN3bkiEso7jBcnhg.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23uR2pJT23A3zZaRvd4ZUxZyMACQWH27rzuyspQNefGjfgJx1aXe3cN3bkiEso7jBcnhg.png"], "description": "Deep dive on placing content into the network. by disregardfiat", "articleBody": "Proof of Access\n\nIt's day 8 of blog everyday November, things have calmed down from moving and I'm back on my grind\n\nThe Hard Part\n\nA little while ago we posted a video of a few of us talking shop and reviewing some code. I mentioned some of the \"hard part\" was around the resource credit nature of BROCA tokens, and the handshake necessary to put content into the network. So here we are a few months later and a whole lot of thought, research, and prototyping later. It's time to talk through \"Content Insertion.\"\n\nMitigating Abuse\n\nOur Solution to this problem has to mitigate several abuse vectors. Let's list and explain them.\n\nPlacing illegal content into the network\nAttributing illegal content to somebody else on the network\nAttributing false meta data to content to draw more rewards\nDoublespend\n\nPlacing Illegal Content\n\nPeople are not likely to want to run infrastructure that can directly get them into trouble. The TOR network has this issue with exit nodes, as it appears illegal communications are their responsibility. It's assumed that all TOR exit nodes are operated by state actors now, which lowers the overall security of the system. 
For this reason attributions about content must be transparent and accurate.\n\nAttributing Illegal Content\n\nMuch like above, attack vectors exist where infrastructure operators could intercept uploads and replace those files with bad content. Wrecking trust in the network at every layer.\n\nFalse Meta Data\n\nSaying a 5MB file is 5GB to recieve a large payout.\n\nDoublespend vectors\n\nA multiparty transaction must be in a defined channel or be cleared at every step. Some solutions would require additional and unwanted user interaction... but this can't come at the expense of safety.\n\nSolutions\n\nWe have a complicated multi-party transaction which means several parties have differing incentives. Front-Ends like 3speak will want to pay for content storage up front to enable their users to use their platform free of charge. Users will want to upload videos. Validators will want to verify videos to receive payment. Storage nodes will want to be verified to receive payment as well. Additionally encorders, indexers, and possibly other services like turn servers and p2p-coordination for chat features on live streams will need effective methods of trust.\n\nTo this effect I've been building state channels for these transactions. For ease of understanding, this is like buying a bearer bond, it can be transferred, with reciepts attached... and if nobody bears the bond at it's expiration... the sale is canceled and the funds return to your account.\n\nChain of Custody\n\nIt's important to note that contracts have obligations for each involved party. Payment obligations, storage obligations, content obligations, and so on.\n\nLet's go through probably the most complicated type of transaction.\n\n3speak.tv Content Upload\n\nA user, Alice, logs into 3speak. When the user goes to upload a file they can trigger API that causes 3speak to promise a Broca amount for the storage of user generated content. 
3speak has a vote bot and will be able to cover a certain amount of broca value even by voting on different content. To this effect a tally can be maintained and a true market condition might allow the average person to upload a 1GB file. So 3speak will initiate a state channel for Alice with a promise to pay for up to 1GB of content for Alice.\n\nSPK network API can show the valid state of this channel with a certain expiration. The nodes in SPK network will debit the appropriate resources from their account and place them into this channel. This channel will include authorized upload nodes. They can also set a few expiration times for this. For example, if json_metadata in a hive post with rewards enabled and a benificary set to 3speak doesn't occur in 24 hours the remainder of the contract can be voided, and the resources returned to 3speak to aid another user to upload their content.\n\nAs Alice is uploading her files her computer will checksum the video, and sign the state channel contract with the additional checksum, and file byte size. This will prevent the upload node, Bob, from attributing other files to Alice's contract.\n\nAs Bob insert's Alice's video into IPFS, a hash is generated and Bob can sign the state channel contract that includes the IPFS locator hash and the checksum/size as verified. Bob can hold this contract to submit it to receive the 24 hour periods Broca to cover the cost of the file upload bandwidth/processor cycles. In the future, when Alice wants to upload a video the allocation might be a little smaller, or non-existent... as Bob might actually be Alice and trying to game rewards for non-content; Or the state channel may no longer list Bob as a upload provider.\n\nThis video may include some headers to have encoder nodes handle the video for online playback. When the file is ready to be distributed these contract and it's various signatures can be included as json_metadata and reconciled in the system. 
Building the contracts necessary to store files in the network and have them aperiodically checked via previously discussed psuedorandom algorithms.\n\nFinalization", "articleBodyHtml": "

Proof of Access


It's day 8 of blog everyday November, things have calmed down from moving and I'm back on my grind


The Hard Part


A little while ago we posted a video of a few of us talking shop and reviewing some code. I mentioned that some of the "hard part" was around the resource credit nature of BROCA tokens, and the handshake necessary to put content into the network. So here we are, a few months and a whole lot of thought, research, and prototyping later. It's time to talk through "Content Insertion."

[Image: craiyon_164319_file_upload_spy_v_spy.png]

Mitigating Abuse


Our solution to this problem has to mitigate several abuse vectors. Let's list and explain them.

Placing illegal content into the network
Attributing illegal content to somebody else on the network
Attributing false metadata to content to draw more rewards
Doublespend

Placing Illegal Content

People are not likely to want to run infrastructure that can directly get them into trouble. The Tor network has this issue with exit nodes, as illegal communications appear to be the exit operator's responsibility. It's widely assumed that all Tor exit nodes are operated by state actors now, which lowers the overall security of the system. For this reason, attributions about content must be transparent and accurate.

Attributing Illegal Content

Much like above, attack vectors exist where infrastructure operators could intercept uploads and replace those files with bad content, wrecking trust in the network at every layer.

False Metadata

Saying a 5MB file is 5GB to receive a larger payout.
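
Catching this kind of lie is cheap once verifiers re-check the bytes themselves instead of trusting the claim. A sketch:

```python
import hashlib

def verify_claim(data: bytes, claimed_size: int, claimed_sha256: str) -> bool:
    """Reject metadata that doesn't match the actual bytes."""
    if len(data) != claimed_size:
        return False
    return hashlib.sha256(data).hexdigest() == claimed_sha256

blob = b"x" * (5 * 1024 * 1024)                 # a real 5MB file
honest = hashlib.sha256(blob).hexdigest()
print(verify_claim(blob, len(blob), honest))    # True: claim matches the bytes
print(verify_claim(blob, 5 * 1024**3, honest))  # False: "it's 5GB" is a size lie
```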

Doublespend vectors

A multiparty transaction must be in a defined channel or be cleared at every step. Some solutions would require additional and unwanted user interaction... but convenience can't come at the expense of safety.


Solutions


We have a complicated multi-party transaction, which means several parties have differing incentives. Front ends like 3speak will want to pay for content storage up front to enable their users to use their platform free of charge. Users will want to upload videos. Validators will want to verify videos to receive payment. Storage nodes will want to be verified to receive payment as well. Additionally, encoders, indexers, and possibly other services, like TURN servers and p2p coordination for chat features on live streams, will need effective methods of trust.


To this effect I've been building state channels for these transactions. For ease of understanding, this is like buying a bearer bond: it can be transferred, with receipts attached... and if nobody bears the bond at its expiration... the sale is canceled and the funds return to your account.
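
The bearer-bond behavior can be sketched as a channel that pays whoever presents it before expiry and otherwise refunds the funder. Field names here are hypothetical, not the network's actual schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StateChannel:
    funder: str
    amount: float
    expires_at: int                                # block height or timestamp
    receipts: list = field(default_factory=list)   # signatures attached along the way

def settle(ch: StateChannel, bearer: Optional[str], now: int) -> tuple:
    """If a bearer presents the channel before expiry they are paid;
    otherwise the funds return to the funder."""
    if bearer is not None and now < ch.expires_at:
        return (bearer, ch.amount)
    return (ch.funder, ch.amount)

ch = StateChannel("3speak", 50.0, expires_at=1000)
print(settle(ch, "alice", now=900))   # ('alice', 50.0): bearer paid in time
print(settle(ch, None, now=1100))     # ('3speak', 50.0): expired, funder refunded
```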

Chain of Custody

It's important to note that contracts have obligations for each involved party: payment obligations, storage obligations, content obligations, and so on.


Let's go through probably the most complicated type of transaction.

3speak.tv Content Upload

A user, Alice, logs into 3speak. When she goes to upload a file, she can trigger an API call that causes 3speak to promise a BROCA amount for the storage of user-generated content. 3speak has a vote bot and will be able to cover a certain amount of BROCA value just by voting on different content. To this effect a tally can be maintained, and a true market condition might allow the average person to upload a 1GB file. So 3speak will initiate a state channel for Alice with a promise to pay for up to 1GB of content.


The SPK Network API can show the valid state of this channel with a certain expiration. The nodes in the SPK Network will debit the appropriate resources from their account and place them into this channel. This channel will include authorized upload nodes. They can also set a few expiration times. For example, if json_metadata in a Hive post with rewards enabled and a beneficiary set to 3speak doesn't occur within 24 hours, the remainder of the contract can be voided and the resources returned to 3speak to aid another user uploading their content.


As Alice uploads her files, her computer will checksum the video and sign the state channel contract with the additional checksum and file byte size. This will prevent the upload node, Bob, from attributing other files to Alice's contract.


As Bob inserts Alice's video into IPFS, a hash is generated, and Bob can sign the state channel contract that includes the IPFS locator hash and the checksum/size as verified. Bob can hold this contract and submit it to receive that 24-hour period's BROCA to cover the cost of the file upload bandwidth and processor cycles. In the future, when Alice wants to upload a video, the allocation might be a little smaller, or non-existent... as Bob might actually be Alice trying to game rewards for non-content; or the state channel may no longer list Bob as an upload provider.
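
The handshake in the last two paragraphs reduces to an append-only chain of signed attestations: Alice binds her checksum and size to the contract, then Bob countersigns with the IPFS locator after verifying. A toy sketch using HMAC as a stand-in for real Hive key-pair signatures; every field name and value here is hypothetical:

```python
import hashlib
import hmac
import json

def sign(key: bytes, payload: dict) -> str:
    """Toy signature: HMAC over canonical JSON of the payload.
    Real nodes would use asymmetric Hive key signatures instead."""
    msg = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

alice_key, bob_key = b"alice-secret", b"bob-secret"

# Alice binds her file's checksum and size to the contract.
contract = {"channel": "3speak:alice:0", "sha256": "ab12...", "bytes": 1048576}
contract["alice_sig"] = sign(alice_key, contract)

# Bob verifies the upload, then countersigns with the IPFS locator.
contract["ipfs_cid"] = "Qm..."  # placeholder CID
contract["bob_sig"] = sign(bob_key, contract)

# Anyone can re-check Alice's attestation from the fields she signed,
# so Bob cannot swap in different bytes under her name.
claimed = {k: contract[k] for k in ("channel", "sha256", "bytes")}
print(hmac.compare_digest(contract["alice_sig"], sign(alice_key, claimed)))  # True
```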


This video may include some headers to have encoder nodes process the video for online playback. When the file is ready to be distributed, these contracts and their various signatures can be included as json_metadata and reconciled in the system, building the contracts necessary to store files in the network and have them aperiodically checked via the previously discussed pseudorandom algorithms.
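
One way every node can independently agree on which contracts get aperiodically checked is deterministic selection seeded by a recent block ID. This is a hypothetical sketch of that kind of pseudorandom scheme, not the network's actual algorithm:

```python
import hashlib

def contracts_to_audit(contract_ids: list, block_id: str, k: int) -> list:
    """Deterministically pick k contracts to audit this round.
    Every node hashing the same block ID selects the same set,
    and nobody can predict the set before the block exists."""
    return sorted(
        contract_ids,
        key=lambda cid: hashlib.sha256((block_id + cid).encode()).hexdigest(),
    )[:k]

ids = [f"contract-{i}" for i in range(10)]
a = contracts_to_audit(ids, "0001abcd", k=3)
b = contracts_to_audit(ids, "0001abcd", k=3)
print(a == b)  # True: selection is deterministic given the block ID
```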


Finalization

", "canonicalUrl": "https://peakd.com/dev/@disregardfiat/proof-of-access-content-insertion"},{"url": "https://hive.blog/blog/@disregardfiat/coffee-coffee", "probability": 0.9265394, "headline": "Coffee Coffee", "datePublished": "2022-10-24T04:24:09.016317", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23yd3nKTe8gzPQabEkiMdCLdnVeUACa1z7XZ6UDVLLBjQ3tmkaCmUQiWRVJJzVxwinatR.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23z79K9YkJ5aTE4ife2vBtwuYhZwFpFttbY2BhUc96Fc5kogAgQsyFKdLuz6WoRBQxiV3.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23zRxsTmfVSUohYzD37ytMND7dWrFPuh9pZ26skqS5Rp9ktJSK939mxDePxdEzTFwurPP.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23yd3nKTe8gzPQabEkiMdCLdnVeUACa1z7XZ6UDVLLBjQ3tmkaCmUQiWRVJJzVxwinatR.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23ywsYq3ZBHysprQLtezfstnX7CpeydZmKGtiHJr5hssz7yLQ7zLCKkBNgWytRTJVeKmG.jpg"], "description": "Just making some coffee by disregardfiat", "articleBody": "Coffee\n\nDay 7 in NovemBlog\n\nIf you're ever in South America you'll find out that,outside of Buenos Aires, there really isn't a great coffee culture. Sure you can find \"coffee\" almost anywhere... but if you look a little closer you'll see some oddities. Supermarkets may not have ANY coffee beans. Finding a whole bean coffee is actually kind of a chore, most supermarkets are stocked completely with instant coffee. If you can find ground coffee be sure to read the labels, most are with sugar added to the coffee grounds.\n\nHere in Paraguay the caffeine culture item is Terer\u00e9. Terer\u00e9 is a yerba mate infusion with cold water, it's more common in Argentina to drink Mate with hot water, and even cold fruit juices over the cold water variety. 
Stores here have aisles dedicated to terer\u00e9 and all the herbs that are commonly added. It can be a little confusing walking thru a store and having 3 or more separate areas to buy spices. Terer\u00e9 spices, cooking spicing, baking spices, and possibly grilling spices. Nearly everybody you see carries around an insulated thermos to keep their water cold, as they add a little bit to their cup and drink it before it warms up throughout the day.\n\nBack to the coffee...\nI poured our two coffee cups worth of water into the pitcher to see how much water they contained. I normally add whole milk to coffee but some water will remain with the grounds. 617mL of water then goes into our electric kettle.\n\nWe like our coffee a little on the intense side so we went for a 1:18 coffee water ratio. 1:15-20 is common for a french press. By this time our water was almost boiling. \"Perfect\" coffee comes from a 93 degree steep... but there is heat losses to the grounds and the glass... so try to get the water a little hotter and find out what temperature is after about minute when you mix the water in. At the end of the day what works best for your taste is what you should do. No reason to make coffee not the way you like it.\n\nInstant coffee has to be dried to be stored, you loose a lot of oils and aromas. Looking at the bubbles after you stir the grounds with the water and you can see the rainbow colored bubbles that tell of flavorful oils and compounds that are usually missed. One thing I can't capture for you is how much better the smell is. This is the coffee you want to wake you up in the morning.\n\nAfter 4 minutes I press the filter through the pitcher and immediately serve the coffee. It's sweet, bright, and quite pleasant. I top off the cup with whole cold milk which brings the temperature down to be pleasant to drink. 
While I can appreciate a black coffee, it just doesn't quite hit the spot in the same way as a cup with a little cream, or whole milk.\n\nThis is the first time Camila has had french pressed coffee and she very much approves. \"It feels like real coffee. Bitter but no need for sugar. The smell is amazing, like perfume.\" I caught her smelling the pitcher after she finished her cup.", "articleBodyHtml": "

Coffee


Day 7 in NovemBlog


If you're ever in South America you'll find out that, outside of Buenos Aires, there really isn't a great coffee culture. Sure, you can find "coffee" almost anywhere... but if you look a little closer you'll see some oddities. Supermarkets may not have ANY coffee beans. Finding whole-bean coffee is actually kind of a chore; most supermarkets are stocked completely with instant coffee. If you can find ground coffee, be sure to read the labels: most have sugar added to the grounds.


Here in Paraguay the caffeine culture item is Tereré. Tereré is a yerba mate infusion with cold water; in Argentina it's more common to drink mate with hot water, or even cold fruit juices instead of plain cold water. Stores here have aisles dedicated to tereré and all the herbs that are commonly added. It can be a little confusing walking through a store and finding 3 or more separate areas to buy spices: tereré spices, cooking spices, baking spices, and possibly grilling spices. Nearly everybody you see carries around an insulated thermos to keep their water cold, as they add a little bit to their cup and drink it before it warms up throughout the day.


Back to the coffee...
I poured two coffee cups' worth of water into the pitcher to see how much water they contained. I normally add whole milk to coffee, but some water will remain with the grounds. 617mL of water then goes into our electric kettle.
\n\"617mL


We like our coffee a little on the intense side, so we went for a 1:18 coffee-to-water ratio; 1:15-20 is common for a french press. By this time our water was almost boiling. "Perfect" coffee comes from a 93°C steep... but there are heat losses to the grounds and the glass... so try to get the water a little hotter and find out what the temperature is about a minute after you mix the water in. At the end of the day, what works best for your taste is what you should do. No reason to make coffee any way other than how you like it.
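
For the curious, the dose falls out of a single division (617mL of water weighs about 617g):

```python
def coffee_dose(water_g: float, ratio: float = 18.0) -> float:
    """Grams of coffee for a given water weight at a 1:ratio brew."""
    return water_g / ratio

print(round(coffee_dose(617)))  # ~34 g for this 1:18 brew
```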

[Image: 34g]

Instant coffee has to be dried to be stored, and you lose a lot of oils and aromas. Look at the bubbles after you stir the grounds into the water and you can see the rainbow-colored bubbles that tell of flavorful oils and compounds that are usually missed. One thing I can't capture for you is how much better the smell is. This is the coffee you want to wake you up in the morning.

[Image: Oils]

After 4 minutes I press the filter down through the pitcher and immediately serve the coffee. It's sweet, bright, and quite pleasant. I top off the cup with cold whole milk, which brings the temperature down to be pleasant to drink. While I can appreciate a black coffee, it just doesn't quite hit the spot the same way as a cup with a little cream or whole milk.

[Image: Finished]

This is the first time Camila has had french pressed coffee and she very much approves. \"It feels like real coffee. Bitter but no need for sugar. The smell is amazing, like perfume.\" I caught her smelling the pitcher after she finished her cup.

", "canonicalUrl": "https://peakd.com/blog/@disregardfiat/coffee-coffee"},{"url": "https://hive.blog/life/@disregardfiat/sunday-funday", "probability": 0.9703214, "headline": "Sunday Funday", "datePublished": "2022-10-24T04:24:09.290781", "datePublishedRaw": "6 months ago", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23xpAn7XY3cnJaC3ZB3P1HvpPgqXFiD1rJN2nfp8fau7XE81sQx6WVWdVoGXtFm4XZDku.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23xpAn7XY3cnJaC3ZB3P1HvpPgqXFiD1rJN2nfp8fau7XE81sQx6WVWdVoGXtFm4XZDku.jpg"], "description": "Expat life in Paraguay by disregardfiat", "articleBody": "Expat Life\n\nIt's day 6 of blog everyday November. My last two posts are in Spanish and Portugese, the first one talked about my first day in a new house that I'm renting for a year. The second was a bunch of pictures my girlfriend and I took in Foz de Iguazu. Well... It's Sunday and a whole lot has come together here in my new house and I guess now is as good a time as any to talk about some really cool and weird things here.\n\nParaguay\n\nSo I decided to settle in Paraguay for a few reasons. They have 100% hydroelectric power, which means fuel shortages won't have a tremendous impact on my life, and electricity is super cheap... like 6 cents/KwH cheap. My house is about a kilometer from a paved road... but I have fiber internet(coming soon). The climate doesn't really get cold enough to snow, but 5-10 degrees(C) isn't too uncommon. It does get pretty hot here being nearly tropical. The import taxes on electronics and other things are fairly reasonable. Brazil you can expect to pay 100% import taxes and Argentina can be even worse with some things... Like the official price of a Nintendo Switch is over $1000, while here it's not much worse than shopping in a high tax city like Chicago. 
As for things that are locally sourced... it's roughly half the price.\n\nMy House\n\nIs a recently constructed 3 bedroom / 3 bathroom of about 150 sq meters(1650sqft). It's got an outdoor kitchen(quincho) which is pretty standard in this area... carne is almost always on the menu. It also has a automatic landscaping lights that are dusk-activated as well as electric fencing on the exterior walls (also fairly common). There is a pump to provide extra pressure for the city water as well as a few thousand liter tank in case the city supply goes offline for a couple of days(uncommon), but the water is a little hard. The water bill is something like $8 a month. Rent here is comfortably sub $1000/month.\n\nServices\n\nI have a gas stove, and a 10kg(22 pound) propane tank refill runs $15; Delivered. The cheapest fiber internet plan is 160MBpS and is $17/month. One thing my girlfriend (from Brazil) is really surprised by is cash on delivery for household items like home appliances and furniture. While one annoying thing is I can't update google maps, so my dirt road remains unnamed and I have to send a pin to everybody who needs to deliver items. But they are more than happy to come to the pin, drop off a TV and get a wad of cash. I have yet to find a delivery for a major appliance that costs any different than walking out of the store with an item. Most people here drive compact cars and I think this really helps the economy here function. Delivery for food on the other hand runs 0-$2... usually around 70cents.\n\nMarkets\n\nAny common items can be found here. Paraguay is kinda known as the shopping destination for Argentina and Brazil... Shopping malls near the border are like state line casinos in Nevada. The range of goods goes from generic to brand name. Some things I still haven't been able to find: the exact espresso machine I want or any immersion circulator(sous vide). In general things from Brazil are even cheaper than in Brazil. 
Electrolux brand appliances to food items, and even Petrobras gasoline. How a state owned oil corporation exports fuels cheaper than they sell it domestically is rather baffling to me... might have something to do with the hydro-electric dam and electricity exports to brazil... but who know? (yes this is an invitation to leave a comment)\n\nLong Term Plan\n\nI'd love to get things here settled enough to start building elsewhere. I need a drivers license and a car... a permanent residency etc... Then I'll feel comfortable enough to buy some land somewhere and build my own little slice of heaven somewhere.", "articleBodyHtml": "

Expat Life


It's day 6 of blog everyday November. My last two posts were in Spanish and Portuguese: the first one talked about my first day in a new house that I'm renting for a year; the second was a bunch of pictures my girlfriend and I took in Foz de Iguazu. Well... it's Sunday, a whole lot has come together here in my new house, and I guess now is as good a time as any to talk about some really cool and weird things here.

[Image: Howdy]

Paraguay


So I decided to settle in Paraguay for a few reasons. It has 100% hydroelectric power, which means fuel shortages won't have a tremendous impact on my life, and electricity is super cheap... like 6 cents/kWh cheap. My house is about a kilometer from a paved road... but I have fiber internet (coming soon). The climate doesn't really get cold enough to snow, but 5-10°C isn't too uncommon, and it does get pretty hot here, being nearly tropical. The import taxes on electronics and other things are fairly reasonable. In Brazil you can expect to pay 100% import taxes, and Argentina can be even worse for some things... the official price of a Nintendo Switch there is over $1000, while here it's not much worse than shopping in a high-tax city like Chicago. As for things that are locally sourced... they're roughly half the price.


My House


It's a recently constructed 3 bedroom / 3 bathroom of about 150 square meters (1650 sq ft). It's got an outdoor kitchen (quincho), which is pretty standard in this area... carne is almost always on the menu. It also has automatic landscaping lights that are dusk-activated, as well as electric fencing on the exterior walls (also fairly common). There is a pump to provide extra pressure for the city water, as well as a few-thousand-liter tank in case the city supply goes offline for a couple of days (uncommon), but the water is a little hard. The water bill is something like $8 a month. Rent here is comfortably sub-$1000/month.


Services


I have a gas stove, and a 10kg (22 pound) propane tank refill runs $15, delivered. The cheapest fiber internet plan is 160 Mbps and is $17/month. One thing my girlfriend (from Brazil) is really surprised by is cash on delivery for household items like home appliances and furniture. One annoying thing is that I can't update Google Maps, so my dirt road remains unnamed and I have to send a pin to everybody who needs to deliver items. But they are more than happy to come to the pin, drop off a TV, and get a wad of cash. I have yet to find a delivery for a major appliance that costs any more than walking out of the store with the item. Most people here drive compact cars, and I think this really helps the economy here function. Delivery for food, on the other hand, runs $0-2... usually around 70 cents.


Markets


Any common items can be found here. Paraguay is kinda known as the shopping destination for Argentina and Brazil... shopping malls near the border are like state-line casinos in Nevada. The range of goods goes from generic to brand name. Some things I still haven't been able to find: the exact espresso machine I want, or any immersion circulator (sous vide). In general, things from Brazil are even cheaper than in Brazil: Electrolux appliances, food items, and even Petrobras gasoline. How a state-owned oil corporation exports fuel cheaper than it sells it domestically is rather baffling to me... it might have something to do with the hydroelectric dam and electricity exports to Brazil... but who knows? (Yes, this is an invitation to leave a comment.)


Long Term Plan


I'd love to get things here settled enough to start building elsewhere. I need a driver's license and a car... a permanent residency, etc... Then I'll feel comfortable enough to buy some land and build my own little slice of heaven.

", "canonicalUrl": "https://peakd.com/life/@disregardfiat/sunday-funday"},{"url": "https://hive.blog/hive-102201/@hivebuzz/wc2022", "probability": 0.95703787, "headline": "HiveBuzz World Cup Contest - Collect badges and win prizes - More than 7000 HIVE to win", "datePublished": "2022-11-24T04:24:08.756319", "datePublishedRaw": "5 months ago", "author": "hivebuzz", "authorsList": ["hivebuzz"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://i.imgur.com/lj0aF9Y.png", "images": ["https://images.hive.blog/768x0/https://i.imgur.com/lj0aF9Y.png", "https://images.hive.blog/768x0/https://i.imgur.com/yQJrKIe.png", "https://images.hive.blog/768x0/https://i.imgur.com/JWbM1Co.png", "https://images.hive.blog/768x0/https://i.imgur.com/U1xwKbC.png", "https://images.hive.blog/768x0/https://i.imgur.com/XYJ5G30.png", "https://images.hive.blog/DQmSWfbie9MTC172sENiA16bsMaz1ofT6AAyTo1ishasrcX/winexcomment.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/zottone444/23t7AyKqAfdxKEJPQrpePMW15BCPhbyrf5VoHWxhBFcEcPLjDUVVQAh9ZAopbmoJDekS6.png", "https://images.hive.blog/768x0/https://i.imgur.com/e62eHrt.png", "https://images.hive.blog/768x0/https://arcange.eu/images/flags/es_16.png", "https://images.hive.blog/32x32/https://images.hive.blog/u/hivebuzz/avatar"], "description": "Participate in the Hivebuzz World Cup Contest and try your luck to win lots of prizes. 
by hivebuzz", "articleBody": "Dieser Beitrag ist ins Deutsche \u00fcbersetzt: hier\nEste post est\u00e1 traducido al espa\u00f1ol - aqu\u00ed\nUne version en fran\u00e7ais de ce post est disponible - ici\n\nThe World Cup is a global event that brings together people from all over the world around a sporting competition well known to all.\n\nIt only takes place every 4 years and we didn't want to miss the opportunity to have fun all together by organizing our own competition in which all members of the community can participate.\n\nTherefore, the @hivebuzz team has prepared a fun contest that will last for the whole duration of the World Cup and will allow you to collect badges and maybe win some HIVE or other prizes!\n\nThe contest is open to everyone, whether you're a football fan or not, and we've made the entry process and rules as simple as possible.\n\nCheck the Word Cup 2022 tab on your board\n\nGo to and type your name to access your board.\n\nYou will notice there is a new Word Cup 2022 tab which contains a badge for each match of the competition.\n\nYour goal is to collect as many badges as possible among the 64 available!\n\nHow to participate?\n\n1. Register for the contest\n\nYou need to register for the contest by sending 1 HIVE to @hivebuzz.pool with the memo worldcup2022.\nThis is a one-time registration to avoid the contest being screwed up by bots. It will be added to the prize pool so you might get it back at the end of the contest if you are lucky or smart enough ;)\n\nYou can enter the competition at any time and the sooner you register and play the more chances you have of winning a bigger prize.\n\n2. Make your bet for each match\n\nEach World Cup game is represented by a badge on your board. 
You can click on it to see information about the match, including when it starts.\n\nNote the presence of the red lock which indicates that bets for this match are not yet open.\n\n@hivebuzz will publish a post for each match and unlock the badge 24 hours before the match begins.\n\nFrom this moment, bets will be open and you can place yours.\n\nUnder each post, @hivebuzz will create up to 3 comments like:\n\nTeam \"A\" win\nTeam \"B\" win\nTie\n\nPS: The 3rd comment (TIE) will only be present during the group phase.\n\nWhat you need to do is to cast a vote on the comment that corresponds to the result you expect for the match.\n\nReplying to any of the above-mentioned comments will be considered a fault. You will receive a red card (downvote) and you will be excluded from the match, meaning you will not get your badge!\n\nExample:\n\nEach time you successfully guessed a match result, the related badge on your board will light up!\n\nWhat reward can you receive?\n\nThe prize pool will be split among participants using the following rules:\n\n1st place: 20% of the prize pool\n2nd place: 10% of the prize pool\n3rd place: 5% of the prize pool\n\nThe rest of the prize pool will be distributed among all the participants in proportion to the number of badges they collected and the total of successful bets from the participants.\n\nExample:\nAll participants (except the first 3 winners) collected 1000 badges\nYou have successfully guessed 34 match results.\nYou will receive 34/1000th of the rest of the prize pool.\n\nHow are the winners determined?\n\nWhen the competition is over, we will count the number of badges collected by each participant. The more badges you have collected, the higher your place in the ranking.\n\nIn the event of a tie between the top players, the winner will be the one who first collected all his badges. 
This means that badges obtained during the group stage will weigh more than those in the final.\n\nExample:\n\nPlayer A collected 10 badges during the group phase\nPlayer B collected 9 badges during the group phase and 1 for the final\nTherefore Player A will be the winner.\n\nIf there is still a tie after applying the previous rule, the guess time (upvote) of the players will decide, the winner being the first to have made their guess (upvote).\n\nContest Global Rules\n\nFor each match:\n\nVotes cast after the scheduled start time of the match will be rejected.\n\nIMPORTANT:\n\nHiveBuzz will use UTC date and time to validate when you cast your vote. Therefore, check your timezone and convert your local time to UTC before voting!\n\nThank you to our sponsors\n\nThe prize pool would not have reached the amount of more than 6500 HIVE without the generosity of our sponsors:\n\nWe also have sponsors who make a contribution other than directly adding HIVE to the prize pool:\n\n@ocd\n\nPost about the world cup in the World Cup 2022 community\nSet @hivebuzz.pool as a 50% beneficiary\nIf you follow the above rules, @ocd will upvote your post. It's all beneficial for everyone since you receive a bigger payout and the possibility of having a bigger prize at the end of the contest as half of the author's reward goes into the prize pool.\n\n@cryptoshots.nft\n\n4000 DOOM gaming tokens that will be distributed in the same way as the HIVE prize pool.\nA Mythic Packs (valued at $26 each) for the top 3 winners and a Starter Packs (valued at $4.5 each) for the next 10 winners.\n\n@ecency\n\n50,000 Ecency Points that will be distributed in the same way as the HIVE prize pool. 
You can them for tipping/transferring, promoting and boosting content on Ecency.\n\n@dcrops\n\n30,000 CROP tokens will be distributed in the same way as the HIVE prize pool.\n\n100 dCrops BETA edition packs (each priced at $3) will be distributed among the top 93 participants as follows\n1st place: 5 packs\n2nd place: 3 packs\n3rd place: 2 packs\n4th to 93th place: 1 pack\nEach pack contains 3 NFTs which you can use to play dCrops and earn CROP tokens.\n\n@infernalcoliseum\n\n250 SOULS tokens will be distributed in the same way as the HIVE prize pool.\n\n245 Infernal Coliseum V2 packs (each valued on average $1.5) will be distributed among the top 200 participants as follows:\n1st place: 10 packs\n2nd place: 9 packs\n3rd place: 8 packs\n4th place: 7 packs\n5th place: 6 packs\n6th place: 5 packs\n7th place: 4 packs\n8th place: 3 packs\n9th place: 2 packs\n10th to 200th place: 1 pack\n\n@hashkings\n\n150 avatar packs from their second set (valued at $2 each). They are collectible and utility NFTs that work in all their ecosystem. Avatars packs will be distributed as follows:\n1st place: 5 packs\n2nd place: 3 packs\n3rd place: 2 packs\n4th to 136th place: 1 pack\n\n100 FACTORIES from the Farming Wars presale, each is sold at $1.25. FACTORIES will be distributed as follows:\n1 FACTORY for each top 100 players\n\n@sportstalksocial\n\n10,000,000 SPORTS tokens will be distributed in the same way as the HIVE prize pool.\n\nMany of our sponsors are witnesses too, therefore kindly consider supporting them if you have a spare vote!", "articleBodyHtml": "
\n\n

\n

The World Cup 2022 will kick off soon!

\n\n

Dieser Beitrag ist ins Deutsche \u00fcbersetzt: hier
\n Este post est\u00e1 traducido al espa\u00f1ol - aqu\u00ed
\n Une version en fran\u00e7ais de ce post est disponible - ici

\n\n

The World Cup is a global event that brings together people from all over the world around a sporting competition well known to all.

\n\n

It only takes place every 4 years and we didn't want to miss the opportunity to have fun all together by organizing our own competition in which all members of the community can participate.

\n\n

Therefore, the @hivebuzz team has prepared a fun contest that will last for the whole duration of the World Cup and will allow you to collect badges and maybe win some HIVE or other prizes!

\n\n

The contest is open to everyone, whether you're a football fan or not, and we've made the entry process and rules as simple as possible.

\n\n

Check the World Cup 2022 tab on your board

\n\n

Go to and type your name to access your board.

\n\n

You will notice there is a new World Cup 2022 tab which contains a badge for each match of the competition.

\n\n
\n\n

Your goal is to collect as many badges as possible among the 64 available!

\n\n

How to participate?

\n\n

1. Register for the contest

\n\n

You need to register for the contest by sending 1 HIVE to @hivebuzz.pool with the memo worldcup2022.
\nThis is a one-time registration to avoid the contest being screwed up by bots. It will be added to the prize pool so you might get it back at the end of the contest if you are lucky or smart enough ;)

\n\n

You can enter the competition at any time, and the sooner you register and play, the more chances you have of winning a bigger prize.

\n\n

2. Make your bet for each match

\n\n

Each World Cup game is represented by a badge on your board. You can click on it to see information about the match, including when it starts.

\n\n
\n\n

Note the presence of the red lock which indicates that bets for this match are not yet open.

\n\n

@hivebuzz will publish a post for each match and unlock the badge 24 hours before the match begins.

\n\n
\n\n

From this moment, bets will be open and you can place yours.

\n\n

Under each post, @hivebuzz will create up to 3 comments like:

\n\n
  1. Team "A" win
  2. Team "B" win
  3. Tie
\n\n

PS: The 3rd comment (TIE) will only be present during the group phase.

\n\n

What you need to do is to cast a vote on the comment that corresponds to the result you expect for the match.

\n\n
\n\n

Replying to any of the above-mentioned comments will be considered a fault. You will receive a red card (downvote) and you will be excluded from the match, meaning you will not get your badge!

\n\n

Example:

\n\n

Each time you successfully guess a match result, the related badge on your board will light up!

\n\n
\n\n

What reward can you receive?

\n\n

The prize pool will be split among participants using the following rules:

\n\n

1st place: 20% of the prize pool
\n2nd place: 10% of the prize pool
\n3rd place: 5% of the prize pool

\n\n

The rest of the prize pool will be distributed among all the participants in proportion to the number of badges they collected and the total of successful bets from the participants.

\n\n

Example:
\nAll participants (except the first 3 winners) collected 1000 badges
\nYou have successfully guessed 34 match results.
\nYou will receive 34/1000th of the rest of the prize pool.
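The split described above can be sketched in a few lines. This is a minimal sketch only: the 7,000 HIVE pool size and the player names and badge counts below are made-up assumptions for illustration, not actual contest figures.

```python
# Sketch of the payout rules: 20% / 10% / 5% to the top three winners,
# remainder split in proportion to badges collected by everyone else.

def split_pool(pool, badges, top3):
    """badges: name -> badge count; top3: the three ranked winners."""
    rewards = {name: pool * share
               for name, share in zip(top3, (0.20, 0.10, 0.05))}
    rest = pool - sum(rewards.values())          # the remaining 65%
    others = {n: b for n, b in badges.items() if n not in top3}
    total = sum(others.values())
    for name, b in others.items():
        rewards[name] = rest * b / total         # e.g. 34/1000th of the rest
    return rewards

# Hypothetical example: 1000 badges among non-winners, one player holds 34.
example = split_pool(7000,
                     {"alice": 64, "bob": 60, "carol": 58,
                      "dave": 34, "everyone-else": 966},
                     ["alice", "bob", "carol"])
```

With these made-up numbers, the remainder after the top three is 4,550 HIVE, and a player with 34 of the 1,000 remaining badges receives 34/1000 of it.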

\n\n

How are the winners determined?

\n\n

When the competition is over, we will count the number of badges collected by each participant. The more badges you have collected, the higher your place in the ranking.

\n\n

In the event of a tie between the top players, the winner will be the one who first collected all their badges. This means that badges obtained during the group stage will weigh more than those in the final.

\n\n

Example:

Player A collected 10 badges during the group phase
Player B collected 9 badges during the group phase and 1 for the final
Therefore Player A will be the winner.

If there is still a tie after applying the previous rule, the guess time (upvote) of the players will decide, the winner being the first to have made their guess (upvote).
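Taken together, the ranking and tie-break rules above amount to a three-part sort key: badge count, then how early the final badge was collected, then the time of the first guess. A minimal sketch, assuming hypothetical `Player` fields (the contest tracks these via upvote timestamps, not this structure):

```python
# Sketch of the tie-break ordering: more badges first, then the earliest
# time the last badge was earned (group-stage badges weigh more), then
# the earliest first guess. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    badges: int
    last_badge_time: float    # when the player's final badge was earned
    first_guess_time: float   # when the player cast their first upvote

def rank(players):
    # Negate badges so more badges sorts first; earlier times break ties.
    return sorted(players, key=lambda p: (-p.badges,
                                          p.last_badge_time,
                                          p.first_guess_time))

# Player A finished collecting during the group phase, B only at the final:
a = Player("A", 10, last_badge_time=30.0, first_guess_time=5.0)
b = Player("B", 10, last_badge_time=64.0, first_guess_time=2.0)
```

Sorting `[b, a]` with this key places A first, matching the worked example above.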

\n\n

Contest Global Rules

For each match:

Votes cast after the scheduled start time of the match will be rejected.

IMPORTANT:

\n\n

HiveBuzz will use UTC date and time to validate when you cast your vote. Therefore, check your timezone and convert your local time to UTC before voting!

\n\n

Thank you to our sponsors

\n\n
\n\n

The prize pool would not have reached the amount of more than 6500 HIVE without the generosity of our sponsors:

\n\n

We also have sponsors who make a contribution other than directly adding HIVE to the prize pool:

\n\n

@ocd

Post about the world cup in the World Cup 2022 community
Set @hivebuzz.pool as a 50% beneficiary
If you follow the above rules, @ocd will upvote your post. It's all beneficial for everyone since you receive a bigger payout and the possibility of having a bigger prize at the end of the contest, as half of the author's reward goes into the prize pool.

@cryptoshots.nft

4000 DOOM gaming tokens that will be distributed in the same way as the HIVE prize pool.
A Mythic Pack (valued at $26 each) for the top 3 winners and a Starter Pack (valued at $4.5 each) for the next 10 winners.

@ecency

50,000 Ecency Points that will be distributed in the same way as the HIVE prize pool. You can use them for tipping/transferring, promoting and boosting content on Ecency.

@dcrops

30,000 CROP tokens will be distributed in the same way as the HIVE prize pool.
100 dCrops BETA edition packs (each priced at $3) will be distributed among the top 93 participants as follows:
1st place: 5 packs
2nd place: 3 packs
3rd place: 2 packs
4th to 93rd place: 1 pack
Each pack contains 3 NFTs which you can use to play dCrops and earn CROP tokens.

@infernalcoliseum

250 SOULS tokens will be distributed in the same way as the HIVE prize pool.
245 Infernal Coliseum V2 packs (each valued on average at $1.5) will be distributed among the top 200 participants as follows:
1st place: 10 packs
2nd place: 9 packs
3rd place: 8 packs
4th place: 7 packs
5th place: 6 packs
6th place: 5 packs
7th place: 4 packs
8th place: 3 packs
9th place: 2 packs
10th to 200th place: 1 pack

@hashkings

150 avatar packs from their second set (valued at $2 each). They are collectible and utility NFTs that work across their whole ecosystem. Avatar packs will be distributed as follows:
1st place: 5 packs
2nd place: 3 packs
3rd place: 2 packs
4th to 136th place: 1 pack
100 FACTORIES from the Farming Wars presale, each sold at $1.25. FACTORIES will be distributed as 1 FACTORY for each of the top 100 players.

@sportstalksocial

10,000,000 SPORTS tokens will be distributed in the same way as the HIVE prize pool.


Many of our sponsors are witnesses too, therefore kindly consider supporting them if you have a spare vote!

\n\n
", "canonicalUrl": "https://peakd.com/hive-102201/@hivebuzz/wc2022"},{"url": "https://hive.blog/vida/@disregardfiat/mi-primero-dia-en-una-nueva-casa", "probability": 0.8556069, "headline": "Mi Primero D\u00eda en Una Nueva Casa", "datePublished": "2022-10-24T04:24:11.376310", "datePublishedRaw": "6 months ago", "inLanguage": "es", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/EpxrjLnDeRdGt3WLbVd2AqJKDuXifczM3Udxp3fzRbzHDY9so7V1NLdpJ4BEsJ84WMD.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23zS7UQTyfTrugjFruS6E2g7MTGcULC7QZECvynCwH6WR392hPqd2iNWPdkyxZny2EBZt.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EpxrjLnDeRdGt3WLbVd2AqJKDuXifczM3Udxp3fzRbzHDY9so7V1NLdpJ4BEsJ84WMD.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/Eq5FuXUEzH4nnoT26tDrWrKSkfyF5GqEFeKAJarFi43DeTLTHPYPqMNjGRRJ9jUwTMe.jpg"], "description": "Hola de una casa vac\u00edo. by disregardfiat", "articleBody": "D\u00eda 4 en Novembre de Blogs\n\nHoy estamos esperando que nos entreguen algunos muebles en nuestro nuevo hogar. Si bien solo tener una mochila es divertido por un tiempo, finalmente podr\u00e9 comprar cosas para una cocina y disfrutar\u00e9 preparando comidas nuevamente.\n\nSolo tengo pocas cosas aqui mismo, pero un hervidor el\u00e9ctrico es suficiente para un caf\u00e9.\n\nObvio ya tengo mi computadora comigo, pero sin un bueno lugar para ella. Entonces, ve mi escritorio de pie improvisado. Nunca estoy demasiado lejos de ella si necesito solucionar un problema.", "articleBodyHtml": "
\n\n

Day 4 of NovemBlog

\n\n

Today we're waiting for some furniture to be delivered to our new home. While having only a backpack is fun for a while, I'll finally be able to buy things for a kitchen and will enjoy preparing meals again.

\n\n
\"Caf\u00e9
\n\n

I only have a few things here, but an electric kettle is enough for a coffee.

\n\n
\"\u00bfun
\n\n

Obviously I already have my computer with me, but no good place for it yet. So, behold my improvised standing desk. I'm never too far from it if I need to troubleshoot a problem.

\n\n
\"Quincho\"
\n\n
", "canonicalUrl": "https://peakd.com/vida/@disregardfiat/mi-primero-dia-en-una-nueva-casa"},{"url": "https://hive.blog/photos/@disregardfiat/foz-de-iguacu", "probability": 0.8216803, "headline": "Foz de Igua\u00e7u", "datePublished": "2022-10-24T04:24:13.676795", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/EonmzzmxN3osD26c7DvDXdfcBFoXYZ1esn2bvGitGGWYm26bNagsZdq3i4oTy8Muq4C.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23r1eQLy7aTsLp8iD1hC7cnu9u9pb22PqLRSNtuj5zynnwAEwRaddASsWdkSiPX8TU3nt.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23uFRhwsFe58g8gEDm36zwbLVy2J44ZM4uRYat5i9mUzyacbX7zBdykMXp5vGXu2EdvMg.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EpJY9vmYGrmitMaJC5y4WjZRGHKbezojz67VuuvQ3gZruWvaMmXMmznrdGgdg43TTWm.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EqWkpcmGUcqBjkqrPctCn2CriK9rT9dW3NcwjerPJBdNxTBGG4kt2AzqzMnDXjCvedd.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23wqpjefR2PjmVSAQxa1gfZwXJE4M51V8XxGxz8Ws7AMmjMFVAfjWDNeLhRm8C8rr6Lpk.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EonmzzmxN3osD26c7DvDXdfcBFoXYZ1esn2bvGitGGWYm26bNagsZdq3i4oTy8Muq4C.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EnqSfSZ8SJueNqyJQ42t34n3dK3G1zW59pFwLWDkV9QRWchXYqdUKnwHmKKvcXAf33e.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/245cYjGBT9b6sVfDZbk7fY37xQMB7kg3NnHds6aKnqrtANU7LKBv33MqDZ7vx4nB5W2eN.jpg"], "description": "Fotos do Foz de Igua\u00e7u by disregardfiat", "articleBody": "H\u00e1 um tempo atr\u00e1s n\u00f3s visitamos Foz de Igua\u00e7u. 
Foi o primeiro dia de inverno.\nEste \u00e9 o Parque Nacional de Foz do Igua\u00e7u, ele tem uma liga\u00e7\u00e3o fronteiri\u00e7a entre tr\u00eas pa\u00edses da Am\u00e9rica do Sul, Brasil (onde essas fotos foram tiradas), Paraguai e Argentina.\nO nome do parque \"Igua\u00e7u\" ou \"Iguazu\" (em espanhol) vem do guarani, l\u00edngua falada pelos povos nativos de algumas regi\u00f5es da Am\u00e9rica do Sul e significa \"\u00e1gua grande\".\nFoz do Igua\u00e7u possui as maiores cachoeiras do mundo em volume de \u00e1gua, pois possui 275 cachoeiras.\nEsta regi\u00e3o mesmo no outono e inverno \u00e9 muito quente e \u00famida, ent\u00e3o quando a corrente de \u00e1gua nos atingiu enquanto caminh\u00e1vamos pelas passarelas do parque, foi um pouco refrescante.\nO parque tem muitos animais nativos da fauna sul-americana, por\u00e9m, s\u00f3 vimos os quatis, que estavam por toda parte, mas foi dif\u00edcil tirar uma foto n\u00edtida, pois eles sempre andavam muito r\u00e1pido.\n\nN\u00f3s tiramos poucas fotos para compartilhar.", "articleBodyHtml": "
\n\n

A while ago we visited Foz de Iguaçu. It was the first day of winter.
This is the Foz do Iguaçu National Park; it sits at the border junction of three South American countries: Brazil (where these photos were taken), Paraguay, and Argentina.
The park's name, "Iguaçu" or "Iguazu" (in Spanish), comes from Guarani, a language spoken by the native peoples of parts of South America, and means "big water".
Foz do Iguaçu has the largest waterfalls in the world by volume of water, with 275 falls.
Even in autumn and winter this region is very hot and humid, so when the spray hit us as we walked along the park's boardwalks, it was a bit refreshing.
The park is home to many animals of the South American fauna; however, we only saw the coatis, which were everywhere, though it was hard to get a sharp photo because they always moved so fast.

\n\n

We took a few photos to share.
\n\"Arco-\u00edris

\n\n
\"Panorama\"
\n\n
\"Uma
\n\n
\"A
\n\n
\"A
\n\n
\"Arco-\u00edris
\n\n
\"Quati\"
\n\n
\"Uma
\n\n
", "canonicalUrl": "https://peakd.com/photos/@disregardfiat/foz-de-iguacu"},{"url": "https://hive.blog/defi/@disregardfiat/the-multi-signature-dao", "probability": 0.9406851, "headline": "The Multi-Signature DAO", "datePublished": "2022-10-24T04:24:13.950978", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/Eo6QGD8R92Up7jf3k6sZjhpUECh5v2w2WGNShtcVgDzYey8bn6qVeLhkk8PpTadM7ng.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/Eo6QGD8R92Up7jf3k6sZjhpUECh5v2w2WGNShtcVgDzYey8bn6qVeLhkk8PpTadM7ng.png"], "description": "Multi-Sigs and DAO. Are they friends or foe? by disregardfiat", "articleBody": "Hive with HoneyComb\n\nDay 3 of NovemBlog\n\nToday I thought I'd try a little article review and rebuttal.\n\nFor those that don't want to click through let's summarize the points first.\n\nAll DAOs and Web3 projects must manage capital\nDAOs and Multi-Sig are more decentralized than DAOs without\nMore signatures - more coordination\nMore wallets - more convenience\n\nAll DAOs and Web3 Projects Must Manage Capital\n\nIt's true, even here on Hive. We have some projects that need very little funds management, like PeakD for example. If they lost their keys they'd have to dig deeper for server bills... but they would be able to recover their services by pointing to new funding accounts and asking all of us to change our DHF votes.\n\nToward the middle of the spectrum we have community vote bots. Hive has some tremendous advantages toward this area. With delegation and content voting this can be done almost with out risk, worst case you lose about 5 days of votes.\n\nAt the opposite end we have layer 2 token architectures like Hive-Engine and HoneyComb. These use different approaches to collect and send funds. 
In most senses these are both DAOs, and HoneyComb builds a multi-signature wallet around it's DEX account on Hive.\n\nDAOs and Multi-Sig are more decentralized than DAOs without\n\nGerald Cotten \u2014 who was the sole possessor of the cryptographic keys to the QuadrigaCX exchange wallet \u2014 died and left funds worth $198,435,000 in an unrecoverable state. Tornado Cash Multi-Signature Key holders coordinated to return unspent ETH.\n\nThis particularly sad incident isn't anywhere near as common as a rugpull.\n\nWhatever the opposite of a rugpull is happened when Tornado cash developers relinquished multi-signature control of a community funding account to the DAO itself. Basically stopping a petty cash account after sanctions from the US government.\n\nMore or Less is More\n\nFinally I want to bring this back to Hive and my HoneyComb Software. HoneyComb autonomously builds and signs multi-signature accounts and transactions. This is required on Hive as we lack a virtual machine / smart contract layer. It is at it's heart a Proof of Stake system with some price controls that that keep the Wallet small enough (through refunds of open orders) that hope to prevent a profitable situation arises that would benefit a theft. If coordination occurs, the community should be able to completely refund the multi-sig wallet by buying the Stake from the collaborators and places those funds in a new wallet.\n\nHoneyComb is autonomous which makes proper coordination as easy as running the software. There still isn't Multi-Signing tools that are user friendly here. Passing transactions around with signatures is a very technical process, and it all has to be done in under an hour.\n\nWe do run into some hard coded limits here. For example with a cap of 40 signatories in an account the Proof-of-Stake method will only allow as much capital as 40 people are willing to put forth. 
\"sharding\" into smaller accounts would allow several more people to pool their resources and provide more possible liquidity safely.\n\nSPK network will likely benefit from this arrangement with 2 wallets, one for the LARYNX pair and one for the SPK pair.", "articleBodyHtml": "
\n\n

Hive with HoneyComb

\n\n

Day 3 of NovemBlog

\n\n
\"crAIyon
\n\n

Today I thought I'd try a little article review and rebuttal.

\n\n

For those that don't want to click through, let's summarize the points first.

All DAOs and Web3 projects must manage capital
DAOs and Multi-Sig are more decentralized than DAOs without
More signatures - more coordination
More wallets - more convenience

All DAOs and Web3 Projects Must Manage Capital

\n\n

It's true, even here on Hive. We have some projects that need very little funds management, like PeakD for example. If they lost their keys they'd have to dig deeper for server bills... but they would be able to recover their services by pointing to new funding accounts and asking all of us to change our DHF votes.

\n\n

Toward the middle of the spectrum we have community vote bots. Hive has some tremendous advantages in this area. With delegation and content voting this can be done almost without risk; worst case, you lose about 5 days of votes.

\n\n

At the opposite end we have layer 2 token architectures like Hive-Engine and HoneyComb. These use different approaches to collect and send funds. In most senses these are both DAOs, and HoneyComb builds a multi-signature wallet around its DEX account on Hive.

\n\n

DAOs and Multi-Sig are more decentralized than DAOs without

\n\n

Gerald Cotten \u2014 who was the sole possessor of the cryptographic keys to the QuadrigaCX exchange wallet \u2014 died and left funds worth $198,435,000 in an unrecoverable state. Tornado Cash Multi-Signature Key holders coordinated to return unspent ETH.

\n\n

This particularly sad incident isn't anywhere near as common as a rugpull.

\n\n

Whatever the opposite of a rugpull is happened when Tornado Cash developers relinquished multi-signature control of a community funding account to the DAO itself, essentially shutting down a petty cash account after sanctions from the US government.

\n\n

More or Less is More

\n\n

Finally, I want to bring this back to Hive and my HoneyComb software. HoneyComb autonomously builds and signs multi-signature accounts and transactions. This is required on Hive since we lack a virtual machine / smart contract layer. At its heart it is a Proof of Stake system with price controls that keep the wallet small enough (through refunds of open orders) to prevent a situation where theft would be profitable. If collusion occurs, the community should be able to completely refund the multi-sig wallet by buying the stake from the colluders and placing those funds in a new wallet.

\n\n

HoneyComb is autonomous, which makes proper coordination as easy as running the software. There still aren't user-friendly multi-signing tools here. Passing transactions around with signatures is a very technical process, and it all has to be done in under an hour.

\n\n

We do run into some hard-coded limits here. For example, with a cap of 40 signatories on an account, the Proof-of-Stake method will only allow as much capital as 40 people are willing to put forth. "Sharding" into smaller accounts would allow several more people to pool their resources and safely provide more liquidity.
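As a rough illustration of the sharding idea, here is a minimal sketch that partitions a stake-ordered signer list into groups under the 40-signatory cap. The account names and any on-chain wiring are illustrative assumptions, not HoneyComb's actual implementation.

```python
# Hypothetical sketch: splitting stakeholders across several multi-sig
# "shards", given the 40-signatory cap per Hive account mentioned above.

MAX_SIGNERS = 40  # the hard-coded per-account limit

def shard_signers(signers):
    """Partition signers into shards of at most MAX_SIGNERS each."""
    return [signers[i:i + MAX_SIGNERS]
            for i in range(0, len(signers), MAX_SIGNERS)]

# 100 stakeholders would need three shard accounts (40 + 40 + 20):
shards = shard_signers([f"user{n}" for n in range(100)])
```

Each shard could then back its own multi-sig account, so liquidity is no longer capped by what 40 people alone are willing to put forth.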

\n\n

SPK Network will likely benefit from this arrangement with two wallets: one for the LARYNX pair and one for the SPK pair.

\n\n
", "canonicalUrl": "https://peakd.com/defi/@disregardfiat/the-multi-signature-dao"},{"url": "https://hive.blog/steemit/@digitalnotvir/how-reputation-scores-are-calculated-the-details-explained-with-simple-math", "probability": 0.8972477, "headline": "How reputation scores are calculated - the details explained with simple math", "datePublished": "2016-04-24T04:24:33.743563", "datePublishedRaw": "7 years ago", "author": "digitalnotvir", "authorsList": ["digitalnotvir"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/http://i.imgur.com/ZIr6YeS.png", "images": ["https://images.hive.blog/768x0/http://i.imgur.com/ZIr6YeS.png"], "description": "I couldn't find a really good explanation of reputation, so I read the code, and here's what I discovered. Simplified vs. Raw scores The score displayed on your Steemit\u2026 by digitalnotvir", "articleBody": "I couldn't find a really good explanation of reputation, so I read the code, and here's what I discovered.\n\nSimplified vs. Raw scores\n\nThe score displayed on your Steemit profile is not the actual value stored in the Steem blockchain, but a simplified version. The scores on profile pages range from a minimum of roughly -25 to maximum of around 75, with newly-opened accounts placed exactly in the middle at 25. The raw scores on the blockchain, however, are actually values in the millions, billions, or even trillions (and these can also be positive or negative).\n\nSimplified scores are shown on Steemit profiles, for example here: steemit.com/@username. Raw scores can be found on steemd.com/@username, note the slightly different domain names there. On steemd.com scroll to the bottom row of the first data box, where it says \"Reputation\", you'll likely see a number in the billions.\n\nHere comes the math\n\nNumbers in the billions are good for computers to do lots of complex math and fine tuning, but they're not easy for humans to read, hence the simplifed version shown on profile pages. 
Here is the formula for the simplification:\n\nTake the log base 10 of the raw score\nSubtract 9\nMultiply by 9\nAdd 25\nRound down to the nearest integer\n\nSo a raw score of 26,714,311,062 becomes a simplified score of 37.\n\nActually, it's a little more complicated than that, but this is a good summary. I'll leave the extra details out for now, because I want to keep this simple. Scores in the lower end are normalized to 25. It takes a little while for newly-registered users to move off 25, but once you finally get to 26 or above then the formula described above is valid.\n\nThis is a snippet of the actual code used by the system. It should make sense to most programmers:\n\nIf you want to read and decipher the exact math, see the code in GitHub on the following page. Look at the last section, beginning \"export const repLog10\".\n\nTracking your score increases\n\nAs mentioned, for new users the simplified reputation score is always 25, and it stays there for a while even if you post and comment a few times. Only once your raw score exceeds around 1,300,000,000 will you finally move up to a simplified score of 26. These are aproximate rounded numbers matching simplified scores to raw scores:\n\n26 = 1,300,000,000\n27 = 2,000,000,000\n29 = 3,000,000,000\n31 = 5,000,000,000\n34 = 10,000,000,000\n\nNote that buying Steem or Steem Power does not increase reputation. Only posting, commenting, and curating will increase your rep. So if you'd like to see your reputation increase (and thus your curation power) the secret is to get engaging! Post and comment with some high-quality and original content.\n\nFurther details\n\nWhile studying this, I also found the following post by @dantheman helpful. It doesn't quite cover the details I've described above, but it provides other useful info about how reputation works:\n\nAlso note, it's possible these reputation calculations could change at any time. 
So if you've Googled and are reading this some time in the future, take care to go back to that GitHub page and check the code again.", "articleBodyHtml": "
\n\n

I couldn't find a really good explanation of reputation, so I read the code, and here's what I discovered.

\n\n

Simplified vs. Raw scores

\n\n

The score displayed on your Steemit profile is not the actual value stored in the Steem blockchain, but a simplified version. The scores on profile pages range from a minimum of roughly -25 to maximum of around 75, with newly-opened accounts placed exactly in the middle at 25. The raw scores on the blockchain, however, are actually values in the millions, billions, or even trillions (and these can also be positive or negative).

\n\n

Simplified scores are shown on Steemit profiles, for example here: steemit.com/@username. Raw scores can be found on steemd.com/@username, note the slightly different domain names there. On steemd.com scroll to the bottom row of the first data box, where it says \"Reputation\", you'll likely see a number in the billions.

\n\n

Here comes the math

\n\n

Numbers in the billions are good for computers to do lots of complex math and fine-tuning, but they're not easy for humans to read, hence the simplified version shown on profile pages. Here is the formula for the simplification:

Take the log base 10 of the raw score
Subtract 9
Multiply by 9
Add 25
Round down to the nearest integer

So a raw score of 26,714,311,062 becomes a simplified score of 37.
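The simplification (log base 10, minus 9, times 9, plus 25, floor) can be written as a short function. This is a minimal sketch mirroring the formula as described, not Steemit's exact code, which handles edge cases such as negative raw scores.

```python
# The simplification formula, expressed directly from the listed steps.
import math

def simplified_rep(raw):
    """Convert a raw blockchain reputation into the profile-page score."""
    return math.floor((math.log10(raw) - 9) * 9 + 25)

score = simplified_rep(26714311062)  # 37, matching the worked example
```

Calling `simplified_rep(26714311062)` reproduces the score of 37 from the example above.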

\n\n

Actually, it's a little more complicated than that, but this is a good summary. I'll leave the extra details out for now, because I want to keep this simple. Scores in the lower end are normalized to 25. It takes a little while for newly-registered users to move off 25, but once you finally get to 26 or above then the formula described above is valid.

\n\n

This is a snippet of the actual code used by the system. It should make sense to most programmers:

\n\n
\n\n

If you want to read and decipher the exact math, see the code in GitHub on the following page. Look at the last section, beginning \"export const repLog10\".

\n\n

Tracking your score increases

\n\n

As mentioned, for new users the simplified reputation score is always 25, and it stays there for a while even if you post and comment a few times. Only once your raw score exceeds around 1,300,000,000 will you finally move up to a simplified score of 26. These are approximate rounded numbers matching simplified scores to raw scores:

26 = 1,300,000,000
27 = 2,000,000,000
29 = 3,000,000,000
31 = 5,000,000,000
34 = 10,000,000,000
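Inverting the same formula gives the approximate raw score needed to reach a given simplified score, which reproduces the roughly 1,300,000,000 threshold for a score of 26 mentioned above. A minimal sketch:

```python
# Inverse of the simplification: minimum raw score for a simplified score.
# Derived by undoing each step: subtract 25, divide by 9, add 9, then 10^x.

def min_raw(simplified):
    """Approximate raw reputation needed for a given simplified score."""
    return 10 ** ((simplified - 25) / 9 + 9)

threshold_26 = min_raw(26)  # just under 1.3 billion
threshold_34 = min_raw(34)  # exactly 10 billion
```

Because the published thresholds are rounded, `min_raw(26)` lands slightly below 1.3 billion; `min_raw(34)` comes out to exactly 10,000,000,000, matching the table.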

Note that buying Steem or Steem Power does not increase reputation. Only posting, commenting, and curating will increase your rep. So if you'd like to see your reputation increase (and thus your curation power) the secret is to get engaging! Post and comment with some high-quality and original content.

\n\n

Further details

\n\n

While studying this, I also found the following post by @dantheman helpful. It doesn't quite cover the details I've described above, but it provides other useful info about how reputation works:

\n\n

Also note that these reputation calculations could change at any time. So if you've found this via Google and are reading it some time in the future, take care to go back to that GitHub page and check the code again.

\n\n
", "canonicalUrl": "https://hive.blog/steemit/@digitalnotvir/how-reputation-scores-are-calculated-the-details-explained-with-simple-math"},{"url": "https://hive.blog/dlux/@an-man/re-disregardfiat-2023310t74050909z", "probability": 0.6173098, "headline": "RE: DLUX DAO | Block Report 72882505", "datePublished": "2063-04-24T00:00:00", "datePublishedRaw": "an-man (63) in #dlux \u2022 last month", "author": "an-man", "authorsList": ["an-man"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/an-man/S8nsxzQS-PP_tif_056320-20Copy.png", "description": "Nice alright. !PIZZA Ill wait to add more till I get the word that it is running and the bugs are not an issue. If you need a test post at some point I have one I am working\u2026 by an-man", "articleBody": "Nice alright. !PIZZA\nIll wait to add more till I get the word that it is running and the bugs are not an issue.\nIf you need a test post at some point I have one I am working on, so let me know and I can post it to help see if it all works.\n!LUV", "articleBodyHtml": "
\n\n

Nice, alright. !PIZZA
I'll wait to add more till I get the word that it is running and the bugs are not an issue.
If you need a test post at some point, I have one I am working on, so let me know and I can post it to help see if it all works.
!LUV

\n\n
", "canonicalUrl": "https://ecency.com/dlux/@an-man/re-disregardfiat-2023310t74050909z"},{"url": "https://hive.blog/dlux/@an-man/re-disregardfiat-202338t73350759z", "probability": 0.8710753, "headline": "RE: DLUX DAO | Block Report 72882505", "datePublished": "2023-02-24T04:24:41.498882", "datePublishedRaw": "2 months ago", "author": "an-man", "authorsList": ["an-man"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/an-man/S8nsxzQS-PP_tif_056320-20Copy.png", "description": "Hello. Glad to hear from the team. I figured the block reports are auto content. Alright cool for as I am going through my travels I am amassing a lot more panoramic to share.\u2026 by an-man", "articleBody": "Hello. Glad to hear from the team.\nI figured the block reports are auto content.\nAlright cool for as I am going through my travels I am amassing a lot more panoramic to share.\nIt sounds like it is best to wait to post until the migration happens?\nSPK through the desktop app or website?\nThanks for your response and thanks for this project which I hope to contribute more to.\n!BEER", "articleBodyHtml": "
\n\n

Hello. Glad to hear from the team.
I figured the block reports are auto content.
Alright, cool. As I go through my travels I am amassing a lot more panoramics to share.
It sounds like it is best to wait to post until the migration happens?
SPK through the desktop app or website?
Thanks for your response, and thanks for this project, which I hope to contribute more to.
!BEER

\n\n
", "canonicalUrl": "https://ecency.com/dlux/@an-man/re-disregardfiat-202338t73350759z"},{"url": "https://hive.blog/steemhelp/@pfunk/a-learner-s-guide-to-using-steem-s-cliwallet-part-1", "probability": 0.9087962, "headline": "Steem Command Line Guide Part 1 - A Learner's Guide to Using cli_wallet", "datePublished": "2016-04-24T04:24:42.722528", "datePublishedRaw": "7 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/http://i.imgur.com/wyYwWTo.jpg", "images": ["https://images.hive.blog/768x0/http://i.imgur.com/wyYwWTo.jpg", "https://images.hive.blog/768x0/http://i.imgur.com/NTpIq4D.png", "https://images.hive.blog/768x0/http://i.imgur.com/6sKPLTW.png", "https://images.hive.blog/768x0/http://i.imgur.com/9oLQv97.png", "https://images.hive.blog/768x0/http://ic.pics.livejournal.com/heartsdesire456/12873778/459024/459024_600.jpg"], "description": "cli_wallet is the command-line program that connects to a Steem node synced with the blockchain to create/broadcast transactions and get info about the chain. It is useful for\u2026 by pfunk", "articleBody": "cli_wallet is the command-line program that connects to a Steem node synced with the blockchain to create/broadcast transactions and get info about the chain. It is useful for people who wish to interact with the Steem blockchain manually and directly to a node, rather than through steemit.com. Why? There are some Steem transactions and transaction variables that can't be done with Steemit's UI. Some might want to keep their more powerful active or owner account keys out of their web browser completely. It is also a way to get raw info about the blockchain directly from a node.\n\nIn this guide I will show you some basic Steem cli_wallet commands and teach you how to explore the others so you can learn more. This is slightly advanced but just about anyone can follow along and learn. Here, in part 1, I show some examples of basic commands and how to format them. 
Part 2 will cover more advanced commands, mainly the key change commands. Part 3 will cover some other useful commands.\n\nSince it would be a lot to write a guide for every command, I am taking a \"teach a man to fish\" approach. I hope this will be a jumping off point for anyone interested in learning more. Even if you don't plan on using the cli_wallet, these posts might be an informative look under the hood of the Steem network.\n\nBuild/get steemd and cli_wallet\n\nTo use the cli_wallet you need to connect to a local Steem node, a program named steemd. Follow these guides for getting one running and synced:\n\nLinux/Linux VM: @omotherhen 's bash scripts for easy Steem installation. Skip to chapter 2 in that post if not using a virtual machine.\nWindows: @tuck-fheman 's Windows mining guide using @bitcube 's latest Windows build and this steem-blocks-and-index.zip for much faster syncing. If the network is hardforked and there is no Windows build available, or you wish to build from source for more security, follow the VM guide or build your own. Skip entering the miner =, witness =, and mining-threads = lines into the config.ini as you don't need to be mining to run steemd.\n\nOnce the blockchain is synced, you should see messages in the steemd console like this:\n620878ms th_a application.cpp:439 handle_block ] Got 3 transactions from network on block 3448303\n\ncli_wallet\n\nSteem's basic command line software connects to steemd to get information and broadcast transactions. In a Linux build it can be found in steem/programs/cli_wallet/. Once steemd is synced, open a new terminal and enter ./cli_wallet in that directory to run it. With bitcube's Windows build, just double click cli_wallet.exe when synced.\n\nWhen you run it for the first time it will ask you to use set_password to make a password for the wallet.json file that it creates. Choose a reasonably strong password for this as you may be importing your more powerful keys into the wallet. 
Note: wallet passwords can't contain special characters for some reason. After setting a password, the wallet is locked, and you must unlock it with unlock followed by the password. You can re-lock it later with lock.\n\nhelp\n\nEnter help to get a long and perhaps confusingly laid out list of commands. I'll go into more detail about a few of these later. To the right of each command are parentheses that usually contain syntax hints for the particular command. Each command is listed with its type to the left. Some are for signing/broadcasting transactions, some query steemd for data, and some are internal wallet commands. I've left these categories out for legibility. These are the commands as of v0.12.1:\n\nabout() cancel_order(string owner, uint32_t orderid, bool broadcast) challenge(string challenger, string challenged, bool broadcast) change_recovery_account(string owner, string new_recovery_account, bool broadcast) convert_sbd(string from, asset amount, bool broadcast) create_account(string creator, string new_account_name, string json_meta, bool broadcast) create_account_with_keys(string creator, string newname, string json_meta, public_key_type owner, public_key_type active, public_key_type posting, public_key_type memo, bool broadcast) create_order(string owner, uint32_t order_id, asset amount_to_sell, asset min_to_receive, bool fill_or_kill, uint32_t expiration, bool broadcast) follow(string follower, string following, set what, bool broadcast) get_account(string account_name) get_account_history(string account, uint32_t from, uint32_t limit) get_active_witnesses() get_block(uint32_t num) get_conversion_requests(string owner) get_feed_history() get_inbox(string account, fc::time_point newest, uint32_t limit) get_miner_queue() get_open_orders(string accountname) get_order_book(uint32_t limit) get_outbox(string account, fc::time_point newest, uint32_t limit) get_owner_history(string account) get_private_key(public_key_type pubkey) 
get_private_key_from_password(string account, string role, string password) get_prototype_operation(string operation_type) get_state(string url) get_transaction(transaction_id_type trx_id) get_witness(string owner_account) gethelp(const string & method) help() import_key(string wif_key) info() is_locked() is_new() list_accounts(const string & lowerbound, uint32_t limit) list_keys() list_my_accounts() list_witnesses(const string & lowerbound, uint32_t limit) load_wallet_file(string wallet_filename) lock() network_add_nodes(const vector & nodes) network_get_connected_peers() normalize_brain_key(string s) post_comment(string author, string permlink, string parent_author, string parent_permlink, string title, string body, string json, bool broadcast) prove(string challenged, bool broadcast) publish_feed(string witness, price exchange_rate, bool broadcast) recover_account(string account_to_recover, authority recent_authority, authority new_authority, bool broadcast) request_account_recovery(string recovery_account, string account_to_recover, authority new_authority, bool broadcast) save_wallet_file(string wallet_filename) send_private_message(string from, string to, string subject, string body, bool broadcast) serialize_transaction(signed_transaction tx) set_password(string password) set_transaction_expiration(uint32_t seconds) set_voting_proxy(string account_to_modify, string proxy, bool broadcast) set_withdraw_vesting_route(string from, string to, uint16_t percent, bool auto_vest, bool broadcast) sign_transaction(signed_transaction tx, bool broadcast) suggest_brain_key() transfer(string from, string to, asset amount, string memo, bool broadcast) transfer_to_vesting(string from, string to, asset amount, bool broadcast) unlock(string password) update_account(string accountname, string json_meta, public_key_type owner, public_key_type active, public_key_type posting, public_key_type memo, bool broadcast) update_account_auth_account(string account_name, authority_type 
type, string auth_account, weight_type weight, bool broadcast) update_account_auth_key(string account_name, authority_type type, public_key_type key, weight_type weight, bool broadcast) update_account_auth_threshold(string account_name, authority_type type, uint32_t threshold, bool broadcast) update_account_memo_key(string account_name, public_key_type key, bool broadcast) update_account_meta(string account_name, string json_meta, bool broadcast) update_witness(string witness_name, string url, public_key_type block_signing_key, const chain_properties & props, bool broadcast) vote(string voter, string author, string permlink, int16_t weight, bool broadcast) vote_for_witness(string account_to_vote_with, string witness_to_vote_for, bool approve, bool broadcast) withdraw_vesting(string from, asset vesting_shares, bool broadcast)\n\ngethelp\n\nMost of the commands listed with help are documented fairly well by way of the gethelp command. Use this command to figure out what the syntax hints to the right of all the commands above really mean, and get further info on the function of the command. We'll use this as a syntax example: gethelp import_key\n\nimport_key\n\nHere is the output of gethelp import_key:\n\nImports a WIF Private Key into the wallet to be used to sign transactions by an account. example: import_key 5KQwrPbwdL6PhXujxW37FSSQZ1JiwsST4cqQzDeyXtP79zkvFD3 Parameters: wif_key: the WIF Private Key to import (type: string)\n\nSimple enough, enter import_key followed by one of your account's private keys, maybe starting with the posting key for safety for now. More on keys here. You can get your private posting key by visiting https://steemit.com/@youraccountname/permissions, and then clicking the Show private key button. Be sure to have a good wallet password if you import your active key, and only import your owner key when you are going to change it. 
You can import as many keys as you want, for as many accounts as you want.\n\ntransfer\n\nOne of the most basic functions of a cryptocurrency network since Bitcoin is transferring funds. The transfer command is able to send liquid currencies on the Steem network, STEEM and SBD (Steem Dollars), to other accounts. Here's the output of gethelp transfer:\n\nTransfer funds from one account to another. STEEM and SBD can be transferred. Parameters: from: The account the funds are coming from (type: string) to: The account the funds are going to (type: string) amount: The funds being transferred. i.e. \"100.000 STEEM\" (type: asset) memo: A memo for the transactionm, encrypted with the to account's public memo key (type: string) broadcast: true if you wish to broadcast the transaction (type: bool)\n\nThus the syntax for the transfer command is:\n\ntransfer ned pfunk \"1000.000 STEEM\" \"high-five\" true\n\nAnything that is a string technically should be in quotes, although for account names that don't have dashes or periods in them, you can get away with skipping them.\n\nThe first account string is the sender, for which you must have the account's active or owner key imported.\nThe second account string is the receiver.\nThird in the syntax is the asset string, which must be in quotes, and use every decimal that the asset uses (both STEEM and SBD use 3 decimal places, and trailing zeroes must be included) followed by the asset abbreviation.\nFourth is the memo string, which needs to be included whether you want to leave a memo or not. Simply use \"\" for no memo.\nLast is the boolean (true/false) variable of whether or not to broadcast the transaction. If you want the transfer to go through, you'll enter true.\n\nlist_my_accounts\n\nHow do we know our account's balances? 
The list_my_accounts command will show us our STEEM balance, VESTS (the internal Steem unit that is Steem Power, denominated differently), and SBD (Steem Dollars) for any account that we have any key imported for. It will list all of the accounts that you have imported keys for. Sample output:\n\ninfo\n\nThis reports back to you a good deal of the current Steem network info. Much of this data can be found on steemd.com as well. info is useful to see the progress of your blockchain sync, as it will tell you how old the timestamp of the last block is: \"head_block_age\": \"1 second old\",\n\nabout\n\nabout\n\nvote and post_comment\n\nYou can vote and post (post_comment) from the command line! It's not so easy to format the command yourself, so I suggest using steemd.com 's advanced mode on any post or comment you'd like to vote or comment on.\n\nClick the your-acct link (orange arrow) to enter your account name to be autofilled. Then copy and paste the given text into your cli_wallet. When commenting, change the \"your reply..\" text to your own comment of course.\n\nOne thing that cli voting allows you to do that Steemit does not (yet) is to adjust the weight of your votes. This allows you to conserve voting power or select exactly how much you want to upvote or downvote. The 100 before the 'true' in the command is the voting weight variable. It can be anywhere from -100 to 100, with negative votes being a downvote, 0-weighted votes being an un-vote, and 100 being a full-power (normal) upvote.\n\nThat's it for part 1. This is meant to be a jumping off point, so anyone who reads this can now probably figure out almost every command available in the cli_wallet. In part 2 I will cover the key changing commands in-depth.", "articleBodyHtml": "
\n\n

cli_wallet is the command-line program that connects to a Steem node synced with the blockchain to create/broadcast transactions and get info about the chain. It is useful for people who wish to interact with the Steem blockchain manually and directly to a node, rather than through steemit.com. Why? There are some Steem transactions and transaction variables that can't be done with Steemit's UI. Some might want to keep their more powerful active or owner account keys out of their web browser completely. It is also a way to get raw info about the blockchain directly from a node.

\n\n

In this guide I will show you some basic Steem cli_wallet commands and teach you how to explore the others so you can learn more. This is slightly advanced but just about anyone can follow along and learn. Here, in part 1, I show some examples of basic commands and how to format them. Part 2 will cover more advanced commands, mainly the key change commands. Part 3 will cover some other useful commands.

\n\n
\n\n

Since it would be a lot to write a guide for every command, I am taking a \"teach a man to fish\" approach. I hope this will be a jumping off point for anyone interested in learning more. Even if you don't plan on using the cli_wallet, these posts might be an informative look under the hood of the Steem network.

\n\n

Build/get steemd and cli_wallet

\n\n

To use the cli_wallet you need to connect to a local Steem node, a program named steemd. Follow these guides for getting one running and synced:

Linux/Linux VM: @omotherhen's bash scripts for easy Steem installation. Skip to chapter 2 in that post if not using a virtual machine.
Windows: @tuck-fheman's Windows mining guide using @bitcube's latest Windows build and this steem-blocks-and-index.zip for much faster syncing. If the network is hardforked and there is no Windows build available, or you wish to build from source for more security, follow the VM guide or build your own. Skip entering the miner =, witness =, and mining-threads = lines into the config.ini, as you don't need to be mining to run steemd.

Once the blockchain is synced, you should see messages in the steemd console like this:
620878ms th_a application.cpp:439 handle_block ] Got 3 transactions from network on block 3448303

\n\n

cli_wallet

\n\n

Steem's basic command line software connects to steemd to get information and broadcast transactions. In a Linux build it can be found in steem/programs/cli_wallet/. Once steemd is synced, open a new terminal and enter ./cli_wallet in that directory to run it. With bitcube's Windows build, just double click cli_wallet.exe when synced.

\n\n

When you run it for the first time it will ask you to use set_password to make a password for the wallet.json file that it creates. Choose a reasonably strong password for this as you may be importing your more powerful keys into the wallet. Note: wallet passwords can't contain special characters for some reason. After setting a password, the wallet is locked, and you must unlock it with unlock followed by the password. You can re-lock it later with lock.

\n\n
\"C01setpassword\"
\n
Make your password random and unlikely to ever be said (this is obviously a bad example)
\n\n

help

\n\n

Enter help to get a long and perhaps confusingly laid out list of commands. I'll go into more detail about a few of these later. To the right of each command are parentheses that usually contain syntax hints for the particular command. Each command is listed with its type to the left. Some are for signing/broadcasting transactions, some query steemd for data, and some are internal wallet commands. I've left these categories out for legibility. These are the commands as of v0.12.1:

\n\n
about()
cancel_order(string owner, uint32_t orderid, bool broadcast)
challenge(string challenger, string challenged, bool broadcast)
change_recovery_account(string owner, string new_recovery_account, bool broadcast)
convert_sbd(string from, asset amount, bool broadcast)
create_account(string creator, string new_account_name, string json_meta, bool broadcast)
create_account_with_keys(string creator, string newname, string json_meta, public_key_type owner, public_key_type active, public_key_type posting, public_key_type memo, bool broadcast)
create_order(string owner, uint32_t order_id, asset amount_to_sell, asset min_to_receive, bool fill_or_kill, uint32_t expiration, bool broadcast)
follow(string follower, string following, set<string> what, bool broadcast)
get_account(string account_name)
get_account_history(string account, uint32_t from, uint32_t limit)
get_active_witnesses()
get_block(uint32_t num)
get_conversion_requests(string owner)
get_feed_history()
get_inbox(string account, fc::time_point newest, uint32_t limit)
get_miner_queue()
get_open_orders(string accountname)
get_order_book(uint32_t limit)
get_outbox(string account, fc::time_point newest, uint32_t limit)
get_owner_history(string account)
get_private_key(public_key_type pubkey)
get_private_key_from_password(string account, string role, string password)
get_prototype_operation(string operation_type)
get_state(string url)
get_transaction(transaction_id_type trx_id)
get_witness(string owner_account)
gethelp(const string & method)
help()
import_key(string wif_key)
info()
is_locked()
is_new()
list_accounts(const string & lowerbound, uint32_t limit)
list_keys()
list_my_accounts()
list_witnesses(const string & lowerbound, uint32_t limit)
load_wallet_file(string wallet_filename)
lock()
network_add_nodes(const vector<string> & nodes)
network_get_connected_peers()
normalize_brain_key(string s)
post_comment(string author, string permlink, string parent_author, string parent_permlink, string title, string body, string json, bool broadcast)
prove(string challenged, bool broadcast)
publish_feed(string witness, price exchange_rate, bool broadcast)
recover_account(string account_to_recover, authority recent_authority, authority new_authority, bool broadcast)
request_account_recovery(string recovery_account, string account_to_recover, authority new_authority, bool broadcast)
save_wallet_file(string wallet_filename)
send_private_message(string from, string to, string subject, string body, bool broadcast)
serialize_transaction(signed_transaction tx)
set_password(string password)
set_transaction_expiration(uint32_t seconds)
set_voting_proxy(string account_to_modify, string proxy, bool broadcast)
set_withdraw_vesting_route(string from, string to, uint16_t percent, bool auto_vest, bool broadcast)
sign_transaction(signed_transaction tx, bool broadcast)
suggest_brain_key()
transfer(string from, string to, asset amount, string memo, bool broadcast)
transfer_to_vesting(string from, string to, asset amount, bool broadcast)
unlock(string password)
update_account(string accountname, string json_meta, public_key_type owner, public_key_type active, public_key_type posting, public_key_type memo, bool broadcast)
update_account_auth_account(string account_name, authority_type type, string auth_account, weight_type weight, bool broadcast)
update_account_auth_key(string account_name, authority_type type, public_key_type key, weight_type weight, bool broadcast)
update_account_auth_threshold(string account_name, authority_type type, uint32_t threshold, bool broadcast)
update_account_memo_key(string account_name, public_key_type key, bool broadcast)
update_account_meta(string account_name, string json_meta, bool broadcast)
update_witness(string witness_name, string url, public_key_type block_signing_key, const chain_properties & props, bool broadcast)
vote(string voter, string author, string permlink, int16_t weight, bool broadcast)
vote_for_witness(string account_to_vote_with, string witness_to_vote_for, bool approve, bool broadcast)
withdraw_vesting(string from, asset vesting_shares, bool broadcast)
\n\n

gethelp

\n\n

Most of the commands listed with help are documented fairly well by way of the gethelp command. Use this command to figure out what the syntax hints to the right of all the commands above really mean, and get further info on the function of the command. We'll use this as a syntax example: gethelp import_key

\n\n

import_key

\n\n

Here is the output of gethelp import_key:

\n\n
Imports a WIF Private Key into the wallet to be used to sign transactions by an account.

example: import_key 5KQwrPbwdL6PhXujxW37FSSQZ1JiwsST4cqQzDeyXtP79zkvFD3

Parameters:
wif_key: the WIF Private Key to import (type: string)
\n\n

Simple enough: enter import_key followed by one of your account's private keys, perhaps starting with just the posting key for safety. More on keys here. You can get your private posting key by visiting https://steemit.com/@youraccountname/permissions and then clicking the Show private key button. Be sure to have a good wallet password if you import your active key, and only import your owner key when you are going to change it. You can import as many keys as you want, for as many accounts as you want.

\n\n

transfer

\n\n

One of the most basic functions of a cryptocurrency network since Bitcoin is transferring funds. The transfer command is able to send liquid currencies on the Steem network, STEEM and SBD (Steem Dollars), to other accounts. Here's the output of gethelp transfer:

\n\n
Transfer funds from one account to another. STEEM and SBD can be transferred.

Parameters:
    from: The account the funds are coming from (type: string)
    to: The account the funds are going to (type: string)
    amount: The funds being transferred. i.e. "100.000 STEEM" (type: asset)
    memo: A memo for the transactionm, encrypted with the to account's
        public memo key (type: string)
    broadcast: true if you wish to broadcast the transaction (type: bool)
\n\n

Thus the syntax for the transfer command is:

\n\n

transfer ned pfunk "1000.000 STEEM" "high-five" true

\n\n

Anything that is a string technically should be in quotes, although for account names that don't have dashes or periods in them, you can get away with skipping them.

\n\n
1. The first account string is the sender, for which you must have the account's active or owner key imported.
2. The second account string is the receiver.
3. Third in the syntax is the asset string, which must be in quotes, and use every decimal that the asset uses (both STEEM and SBD use 3 decimal places, and trailing zeroes must be included) followed by the asset abbreviation.
4. Fourth is the memo string, which needs to be included whether you want to leave a memo or not. Simply use "" for no memo.
5. Last is the boolean (true/false) variable of whether or not to broadcast the transaction. If you want the transfer to go through, you'll enter true.
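Putting the five parts together, a transfer with no memo would be entered like this (the account names here are placeholders):

```
transfer "alice" "bob" "5.000 SBD" "" true
```

Note the three decimal places on the asset amount and the empty quoted string standing in for the memo.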
\n\n

list_my_accounts

\n\n

How do we know our account's balances? The list_my_accounts command will show us our STEEM balance, VESTS (the internal Steem unit that is Steem Power, denominated differently), and SBD (Steem Dollars) for any account that we have any key imported for. It will list all of the accounts that you have imported keys for. Sample output:

[Screenshot: list_my_accounts sample output]

info

\n\n

This reports back to you a good deal of the current Steem network info. Much of this data can be found on steemd.com as well. info is useful to see the progress of your blockchain sync, as it will tell you how old the timestamp of the last block is: "head_block_age": "1 second old",

\n\n

about

\n\n

about

\n\n

vote and post_comment

\n\n

You can vote and post (post_comment) from the command line! It's not so easy to format the command yourself, so I suggest using steemd.com's advanced mode on any post or comment you'd like to vote or comment on.

\n\n
\"C02steemd\"
\n\n
\"C03steemd\"
\n\n

Click the your-acct link (orange arrow) to enter your account name to be autofilled. Then copy and paste the given text into your cli_wallet. When commenting, change the \"your reply..\" text to your own comment of course.

\n\n

One thing that cli voting allows you to do that Steemit does not (yet) is to adjust the weight of your votes. This allows you to conserve voting power or select exactly how much you want to upvote or downvote. The 100 before the 'true' in the command is the voting weight variable. It can be anywhere from -100 to 100, with negative votes being a downvote, 0-weighted votes being an un-vote, and 100 being a full-power (normal) upvote.
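As a hypothetical illustration of the weight variable (the voter, author, and permlink below are placeholders), a half-strength downvote would be formatted like this:

```
vote youraccount someauthor some-post-permlink -50 true
```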

\n\n

That's it for part 1. This is meant to be a jumping off point, so anyone who reads this can now probably figure out almost every command available in the cli_wallet. In part 2 I will cover the key changing commands in-depth.

\n\n
", "canonicalUrl": "https://hive.blog/steemhelp/@pfunk/a-learner-s-guide-to-using-steem-s-cliwallet-part-1"},{"url": "https://hive.blog/ai/@sachingeorge/re-disregardfiat-202334t182156685z", "probability": 0.8537826, "headline": "RE: AI-Generated Content = Not Original Content", "datePublished": "2023-02-24T04:24:49.825604", "datePublishedRaw": "2 months ago", "author": "sachingeorge", "authorsList": ["sachingeorge"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.ecency.com/DQmcyhTuwNdeuenvKwboutMXB8b9GS9oHAMMNiWyPadpGNj/download_2023_04_15t144556.384.png", "description": "That's a very stupid statement to be frank. Are you saying centralised system is free of corruption ? You just need to see crypto as a decentralised payment option, nothing more\u2026 by sachingeorge", "articleBody": "That's a very stupid statement to be frank. Are you saying centralised system is free of corruption ? You just need to see crypto as a decentralised payment option, nothing more, nothing less anything more is due to greed.", "articleBodyHtml": "
\n\n

That's a very stupid statement, to be frank. Are you saying a centralised system is free of corruption? You just need to see crypto as a decentralised payment option, nothing more, nothing less; anything more is due to greed.

\n\n
", "canonicalUrl": "https://ecency.com/ai/@sachingeorge/re-disregardfiat-202334t182156685z"},{"url": "https://hive.blog/hive-112019/@jelly13/rpw32s", "probability": 0.684961, "headline": "RE: Novel Voting Mechanism - SPK Network Team Meeting", "datePublished": "2023-02-24T04:24:55.377222", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmYmVSS3CDAC5SgC8527RPFGM1g8KD2Qk4zgdBrQ772w53/jellyPA1.gif", "description": "OK. Why would any validator power up SPK on their account rather than alt? Why did you say powering up on alt can be beneficial for someone owning more than 1/20 of apathetic stake? by jelly13", "articleBody": "[-]\n\ndisregardfiat (73) 2 months ago\n\nThe account will earn Broca for it's powered SPK. So for instance spknetwork will want to consolidate their stake to enable running their services. This accumulation should make people want to vote for them as a validator. If you must exercise a >5% stake powering it over 2 accounts would allow you to vote with a bigger stake, but most people would be wondering why an account like @ ranchorelaxo votes identically to haejin; possibly enough to not be voted as a validator... but not like it would matter, you can still exercise your stake with the extra step.\n\n$0.00 Reply\n\n[-]\n\njelly13 (-15) (1) 2 months ago $0.00 Reveal Comment\n\n[-]\n\ndisregardfiat (73) 2 months ago\n\nThe Broca market is a little weird. It's like food assistance where you get them but you have to use on certain goods. They can't just be redeemed for Hive/SPK/HBD/etc... Of course people will sell them and they should have some fungible uses; @ spknetwork for instance will place their Broca in contracts for their users to have free access to their platform... but for precisely this reason I'm hoping not to build direct Broca sales... 
but continue to utilize service contracts to transform Broca into tokens that pay for the service providers.\n\nThis does get into some roles of the validators. Say @ alice.alt has her 10K SPK powered, and builds self directed contracts to upload and store some data. @ alice.alt is providing additional services like upload and storage. If she fails to perform services for the wider market and is only interesting in cleaning broca her accounts reputation will go down, and her pass thru rate will suffer.\n\nAlternatively, if she has an elected validator she'll be more capable of earning with her powered SPK as the good standing she has will carry over to other services she provides.\n\nIt's a hard problem and we're here hoping to plug these holes to the best of our collective ability and hearing these questions has already changed some of the code for the better. In short, aligning incentives should help everybody build this network together... Voting and reputation are tied a bit together and the validators will earn more ready-to-redeem broca through validating the contracts that are assigned to them... this will have a SPK weight to it as well.", "articleBodyHtml": "
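The Broca flow disregardfiat describes above (earned on powered SPK, not directly redeemable for Hive/SPK/HBD, spendable only by funding service contracts that ultimately pay providers) can be sketched as a toy model. This is illustrative only; the class and method names are invented, not SPK Network code, and the accrual rate is made up:

```python
# Toy sketch of the Broca mechanics described above (not SPK Network code).
# Assumption: Broca accrues in proportion to powered SPK; the only spend
# path is locking it into a service contract.

class Account:
    def __init__(self, name: str, powered_spk: float):
        self.name = name
        self.powered_spk = powered_spk
        self.broca = 0.0

    def accrue_broca(self, rate: float) -> None:
        # Broca is earned against the account's powered SPK.
        self.broca += self.powered_spk * rate

    def redeem_for_tokens(self, amount: float):
        # Mirrors "they can't just be redeemed for Hive/SPK/HBD".
        raise ValueError("Broca is not directly redeemable for Hive/SPK/HBD")

    def fund_storage_contract(self, amount: float) -> dict:
        # The one legitimate spend: place Broca into a service contract,
        # which is what ends up paying the service providers.
        if amount > self.broca:
            raise ValueError("insufficient Broca")
        self.broca -= amount
        return {"funder": self.name, "broca": amount, "pays": "service providers"}

alice = Account("alice.alt", powered_spk=10_000)
alice.accrue_broca(rate=0.01)            # hypothetical rate: 100 Broca
contract = alice.fund_storage_contract(100)
assert alice.broca == 0 and contract["broca"] == 100
```

The food-assistance analogy maps directly: `redeem_for_tokens` always fails, while `fund_storage_contract` is the "certain goods" the credits can be spent on.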
", "canonicalUrl": "https://hive.blog/hive-112019/@jelly13/rpw32s"},{"url": "https://hive.blog/hive-112019/@jelly13/rpxm2b", "probability": 0.90113217, "headline": "RE: Novel Voting Mechanism - SPK Network Team Meeting", "datePublished": "2023-02-24T04:24:55.708703", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmYmVSS3CDAC5SgC8527RPFGM1g8KD2Qk4zgdBrQ772w53/jellyPA1.gif", "description": "All good, I like the sentiment. I felt I was missing something but it turns out I probably wasn't so there s no need to dig into one loose statement any longer. TBH, I do not\u2026 by jelly13", "articleBody": "All good, I like the sentiment. I felt I was missing something but it turns out I probably wasn't so there s no need to dig into one loose statement any longer. TBH, I do not even think this is a hole, just a nuisance. I am not sure if there is any point to motivate validators to exhibit collecting non-voting SPK Power but I do not really care because I can see the setup allows the water to find its natural way eventually.\n\nFTR, I am fine with Alice. When she picked the name for her @ alice.alt account, she demonstrated that she is going to run the business honestly and openly. She only follows a practice that the code dictates. I will gladly unvote anyone trying to collect votes by criticising her.", "articleBodyHtml": "
", "canonicalUrl": "https://hive.blog/hive-112019/@jelly13/rpxm2b"},{"url": "https://hive.blog/communityfork/@hiveio/announcing-the-launch-of-hive-blockchain", "probability": 0.8897418, "headline": "Announcing the Launch of Hive Blockchain", "datePublished": "2020-04-24T04:24:53.893070", "datePublishedRaw": "3 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.steempeak.com/file/steempeak/hiveio/4WaiXF2y-opengraph3x.png", "images": ["https://images.hive.blog/768x0/https://files.steempeak.com/file/steempeak/hiveio/4WaiXF2y-opengraph3x.png", "https://images.hive.blog/768x0/https://cdn.steemitimages.com/DQmZmqyLWRqXCHKV26ujVQ4EWted7ZXq6zgrTnYe7CzBPYi/1584471708229.png"], "description": "After weeks of hard teamwork, we're announcing the launch of the new Hive blockchain. by hiveio", "articleBody": "The buzz is real! After weeks of hard teamwork, we're announcing the launch of the new Hive blockchain.\n\nThis is an exciting time, and an opportunity to channel the love and power of a resilient and tenacious community into collaborative development for the future of social blockchain.\n\nHive is a passionate effort, created by a large group of Steem community members who have long looked to move towards true decentralization and to help develop the code base. 
The years of distribution issues and reliance on a central entity for code and infrastructure have been at the heart of a revolution of sorts, and the new Hive blockchain is the culmination of stepping up to meet the challenge of returning to shared values of protecting and celebrating the vibrant community that has grown around our ecosystem.\n\nCheck out the FAQ below for info on the launch of Hive.\n\nWhat is Hive and why was it created?\n\nHive is a DPOS governance blockchain created by implementing a hard fork of existing Steem code.\n\nThe intention of this community-driven fork is to support and build on the strong Steem community values that have made our ecosystem so diverse and exciting. This new direction steps away from the burden of the Steemit Inc. ninja-mined stake, which has impacted the long-term ability to work towards further development and decentralization for years.\n\nHive has begun with a talented and committed team of community developers who are already paving the way to implement much-awaited improvements and robust new developments to the blockchain. It is exciting to see the community \u2013 from devs and business owners to passionate end users \u2013 stepping forward to embrace and contribute to the potential of Hive. This renewed spirit, combined with a renewed codebase and a focus on working more closely with the entire ecosystem, is key to the success and possibilities for the future of social blockchain.\n\nWhat is the difference between Hive and Steem?\n\nHive is a fork of the Steem blockchain and runs completely independently from the existing chain. All valid accounts on the Steem blockchain will receive an airdrop on Hive. (more details below)\n\nWhile the Hive chain is a fork of the Steem code, the intention is to take responsibility to build something even better. 
The initial Hive launch will be a direct copy of the existing blockchain with a few small upgrades, which will allow us to get back to community discussion on the direction for future development, needed changes, and most wanted chain-level features.\n\nThe spirit of Steem and the goal of decentralization has always been to move away from a single point of authority, and to capitalize on and encourage the potential that has gathered in our community. Hive is drawing on the huge range of talent in our ecosystem to accelerate development, improve communication, and return to a focus on ease of use, onboarding, and marketing. The desire to move towards better decentralization and to connect people via blockchain has never been lost, and it's what makes Hive so needed now.\n\nWhat happens when Hive launches?\n\nWhen the Hive blockchain launches, existing Steem accounts will appear on the Hive Network as well. You will have two accounts: the current one on Steem, and a new Hive account, which will be pre-populated with all of your current Steem content and information.\n\nThe history of these two accounts will be the same, but from the snapshot point onward, the accounts will be independent from one another. Actions you take on one network will not be reflected on the other. This means that future content and transactions will belong to whichever account you use.\n\nIf you post on Hive, it will not show up on Steem, and vice versa. The two chains will be completely independent of one another after launch.\n\nWhen will Hive launch?\n\nThe Hive blockchain launch will be at 10am EST/14:00 UTC, Friday, March 20th. (countdown at https://hive.io/) All of the magic will happen this week! The snapshot and airdrop will take place back to back at the time of launch.\n\nWhere can I find Hive's code?\n\nAs a STEEM holder, what do I get in the Hive airdrop?\n\nAll valid STEEM stakeholders will receive a perfectly mirrored balance of their current STEEM holdings in the new HIVE coin. 
This will include matching amounts of current liquid STEEM, Steem Power, and SBD. All other account state data, such as claimed accounts, delegations, etc., will be mirrored onto Hive.\n\nFor example: if your Steem account has 10 STEEM, 5 SBD, and 1000 SP, your new Hive account will have 10 HIVE, 5 HBD, and 1000 HP.\n\nYou will not need to claim anything. Simply log in with your existing Steem account details and you will have your new HIVE coins in your wallet and be able to start using the blockchain.\n\nThe Hive airdrop will only be performed on the current version of the Steem blockchain (as of the date of this post). Any \"emergency hardforks\" performed prior to the airdrop will not be eligible. Further, any exchange that participates in such a hardfork prior to the airdrop date will null and void the ability to participate in the airdrop for its off-chain balance-holding customers.\n\nHow do I access my Hive account to post content and send Hive coins?\n\nYou can access your Hive account by simply logging in on the Hive Network using your existing Steem account keys. The first frontend available will be https://hive.blog. This is currently pointed at the Steem blockchain, but will be switched to Hive at launch time. Other popular interfaces are completing the switch over to Hive or finishing new products, and will be announcing when they are ready to use!\n\nWho is behind the development of Hive?\n\nThere are currently over 30 experienced developers contributing to the new ecosystem, alongside many committed community members working on multiple aspects of Hive. These individuals may announce their participation through their own accounts in the future. This is a true community effort, and therefore is open to all. Hive will remain open source, and open to everyone who wants to contribute to the future development of the blockchain. 
Testnet will be available after this post is published, so feel free to comment here if you would like access and we'll reach out. The full codebase will move to a public repo on Thursday, March 19th, prior to launch. Updates will be made to this post and via this account.\n\nWhat improvements have been added?\n\nThe most important improvements and decisions will be made after initial launch with a chance for proper feedback from the community. To successfully launch, there are some aspects that needed to be addressed immediately, including prevention of exchanges from participating in governance attacks (as we witnessed on the Steem Blockchain).\n\nTo prevent governance (or funding) attacks, a 30-day delay has been added on crediting vests towards witness and SPS votes. Further governance changes will need to be developed alongside the community.\n\nThis delay means that after an account \u201cpowers up\u201d or stakes funds, there will be a 30-day delay before those vests can be used towards voting on governance (witnesses) or through voting in the SPS. 
For all other actions, there is no delay and vests are immediately available.\n\nThis is an initial way for us to mitigate this risk while continuing to improve the system as a whole, and without deciding on governance changes before returning to collection of wider community input.\n\nWill all Steem accounts be included in the HIVE airdrop?\n\nThe goal of Hive is to continue moving towards true decentralization, and therefore the launch airdrop will include all accounts who have showcased these same values and shared goals.\n\nThe only accounts who will not be included in the initial airdrop are those containing the Steemit Inc ninja-mined stake, and those who actively contributed to (and publicly declared support for) the centralization of the Steem Blockchain.\n\nThese accounts can still choose to take part in the new chain and their accounts will exist, but they will not be included in the initial airdrop.\n\nImportant notes for cryptocurrency exchanges that list STEEM\n\nDo you plan to support the Hive airdrop by taking a snapshot of your customers\u2019 Steem holdings at the time of the airdrop and crediting them with a similar amount of Hive (Yes/No)?\n\nYES) If you DO want to participate in the airdrop, please notify the Hive team that your exchange will distribute the airdrop on a pro-rata basis to your Steem customers (no fixed timeline for distribution is required), and the Hive account that corresponds to your Steem account will receive the airdrop at the time of the launch of Hive.\n\nNO) If you DO NOT want to participate in the airdrop, please send out a notification to your Steem-holding users to forewarn them that you will not be participating, so that they can temporarily withdraw their Steem from your exchange before the airdrop snapshot date, if they wish to participate in the Hive airdrop.\n\nIf at all possible, please let us know by Thursday, March 19 if you will or will not be participating in the airdrop.\n\nYou can notify us by leaving a message 
on this post using your official Steem account, or you can contact one of our exchange liaisons:\n\nDirect follow-ups will be made, but if we do not receive an affirmative response by day's end Thursday, March 19th, we will assume that your exchange doesn\u2019t wish to participate in the airdrop for holding customer balances.\n\nWe're looking forward to the launch of Hive!\n\nThere's a lot to come during this extremely busy week, and this post and account will be updated to reflect new information as it becomes available.", "articleBodyHtml": "
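The two mechanical rules in the FAQ above, the 1:1 balance mirror at the snapshot and the 30-day delay before fresh vests count toward governance, can be sketched as a toy model. This is illustrative only; the dataclasses and function names are invented for the sketch and are not part of any Hive API:

```python
from dataclasses import dataclass

GOVERNANCE_DELAY_DAYS = 30  # delay on vests counting toward witness/SPS votes

@dataclass
class SteemAccount:
    steem: float
    sbd: float
    sp: float

@dataclass
class HiveAccount:
    hive: float
    hbd: float
    hp: float

def airdrop(acct: SteemAccount) -> HiveAccount:
    """Mirror a Steem balance onto Hive 1:1 at the snapshot."""
    return HiveAccount(hive=acct.steem, hbd=acct.sbd, hp=acct.sp)

def governance_weight(stakes: list[tuple[float, int]]) -> float:
    """stakes: (amount, days_since_power_up) pairs. Only vests powered up
    at least 30 days ago count toward witness/SPS voting; for all other
    actions there is no delay."""
    return sum(amount for amount, age in stakes if age >= GOVERNANCE_DELAY_DAYS)

# The FAQ's own example: 10 STEEM, 5 SBD, 1000 SP -> 10 HIVE, 5 HBD, 1000 HP.
acct = airdrop(SteemAccount(steem=10, sbd=5, sp=1000))
assert (acct.hive, acct.hbd, acct.hp) == (10, 5, 1000)

# 1000 HP staked 40 days ago votes; 500 HP staked yesterday does not yet.
assert governance_weight([(1000, 40), (500, 1)]) == 1000
```

The delay only gates governance actions; a fuller model would track each power-up's timestamp and recompute eligibility per vote.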
", "canonicalUrl": "https://hive.blog/communityfork/@hiveio/announcing-the-launch-of-hive-blockchain"},{"url": "https://hive.blog/ai/@guiltyparties/rr0c11", "probability": 0.6291452, "headline": "RE: AI-Generated Content = Not Original Content", "datePublished": "2023-02-24T04:24:58.305516", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmdFnUq1sf4nUWzxdE73WzheNcja9NfYsh4yrJve9J134m/fire_small.png", "description": "If you were using Midjourney to make a logo or let's say a title graphic, that makes sense. If you used Midjourney to create a pile of art and then presented it as if you had\u2026 by guiltyparties", "articleBody": "If you were using Midjourney to make a logo or let's say a title graphic, that makes sense. If you used Midjourney to create a pile of art and then presented it as if you had hand-drawn it, that's an entirely different matter.", "articleBodyHtml": "
", "canonicalUrl": "https://hive.blog/ai/@guiltyparties/rr0c11"},{"url": "https://hive.blog/hive-112019/@jelly13/rpwrk2", "probability": 0.82248175, "headline": "RE: Novel Voting Mechanism - SPK Network Team Meeting", "datePublished": "2023-02-24T04:24:58.393125", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmYmVSS3CDAC5SgC8527RPFGM1g8KD2Qk4zgdBrQ772w53/jellyPA1.gif", "description": "That's not what I am confused about - I am asking why you differentiate the under 1/20 scenario from the over 1/20 one. Say there is 1M apathy tokens and Alice controls @\u2026 by jelly13", "articleBody": "That's not what I am confused about - I am asking why you differentiate the under 1/20 scenario from the over 1/20 one.\n\nSay there is 1M apathy tokens and Alice controls @ alice.main validator and @ alice.alt non-validator.\n\nIf Alice owns 10k tokens (below 1/20), powered up on @ alice.main, she votes with 20k (main) + 0 (alt). Had she powered up on @ alice.alt, she would vote with 20k (main) + 10k (alt).\n\nObviously, dead zero SPK Power on validator account looks weird, so that's why I added \"as much as possible\" disclaimer.\n\n(I understand that validators and non-validators earn BROCA at the same rate).", "articleBodyHtml": "
", "canonicalUrl": "https://hive.blog/hive-112019/@jelly13/rpwrk2"},{"url": "https://hive.blog/hive-112019/@jelly13/rpvrh2", "probability": 0.7345367, "headline": "RE: Novel Voting Mechanism - SPK Network Team Meeting", "datePublished": "2023-02-24T04:25:02.121150", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmYmVSS3CDAC5SgC8527RPFGM1g8KD2Qk4zgdBrQ772w53/jellyPA1.gif", "description": "No, my point is actually kinda opposite - I am going to demonstrate that even the most reasonable upper bound still carries the imbalance you described. This will likely be\u2026 by jelly13", "articleBody": "I also agree an upper bound isn't necessary for for many things\n\nNo, my point is actually kinda opposite - I am going to demonstrate that even the most reasonable upper bound still carries the imbalance you described.\n\nThis will likely be 10%... so if the current fee is .005 then you could vote for 0.00495 or 0.00505.\n\nThat is an invitation to split a large stake between multiple accounts. More of a request, actually.\n\n1/20th of of apathy votes.\n\nCool. Now I can formulate my actual question. Do validators always vote with 1/20 of apathy votes regardless of their SPK Power?\n\n1 day seems like a fine place to draw the lower line.. but even a year is a silly upper limit when 30 year bonds are a thing.\n\nI totally agree. This particular variable looks tough to vote on. Logarithmic scale perhaps?", "articleBodyHtml": "
", "canonicalUrl": "https://hive.blog/hive-112019/@jelly13/rpvrh2"},{"url": "https://hive.blog/hive-112019/@spknetwork/larynx-claimdrop-is-over", "probability": 0.7733962, "headline": "LARYNX Claimdrop is over!", "datePublished": "2023-02-24T04:25:13.971416", "datePublishedRaw": "2 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/243WKRHgDQT1GoeYFDv7Js4wgAVuhKHsE1RKsm4JRmGP8gRuLfSQNWZq9DyFrXHf34rXk.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/243WKRHgDQT1GoeYFDv7Js4wgAVuhKHsE1RKsm4JRmGP8gRuLfSQNWZq9DyFrXHf34rXk.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJoKHpwG7xEPGYaaCrPkynaJHYXoF82C9R7HfRCuJVNatWMzaXeRov2jpguJ8LG.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23swiDpewbMmAJxbR61qU25nszQ1PSYtF7LiRUma4DqAFrUAhqckcu27UuW2vAaGdWmQQ.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/thegoliath/23tkdqLBSzq822Jn26Sh4GbxNnU7jhgw6kMcH4iH5BxpoZkjxyapYx9Y5BX3w7XC2MTm3.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tSzLNbhBtmt6UboDLcYm6aqDLZisRQUBZSWLYu2evhUJFZbQE9ZH4vCT4xuQHTqGANQ.png"], "description": "Hello community! One year ago, we announced that the claim drop for LARYNX tokens was going to start on March 20 of 2022. A year has passed, and the Hive community claimed\u2026 by spknetwork", "articleBody": "Hello community!\n\nOne year ago, we announced that the claim drop for LARYNX tokens was going to start on March 20 of 2022. A year has passed, and the Hive community claimed 69,775,870.472 LARYNX tokens. Thank you to everyone that participated in the monthly claim.\n\nYou may have seen a pending balance claim in some frontends. This should be fixed if the frontend or app uses an updated API. 
This is also a call for all frontends to update their APIs.\n\nSPKCC Monitor\n\n@hivetrending from @pizza.witness. The @hivecreators team is working to improve the site's overall design.\n\nhttps://vote.hive.uno/@threespeak\n\nAbout the SPK Network:\n\nThe SPK Network is a decentralized Web 3.0 protocol that rewards value creators and infrastructure providers appropriately and autonomously by distributing reward tokens so that every user, creator, and platform can earn rewards on a level playing field.", "articleBodyHtml": "
", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/larynx-claimdrop-is-over"},{"url": "https://hive.blog/hive-112019/@spknetwork/wzbjmtil", "probability": 0.5125639, "headline": "Web3 Finance Talk with @taskmaster4450", "datePublished": "2022-11-24T04:25:15.278031", "datePublishedRaw": "5 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeigw5x2rhnn7ocuq3xstzxi25i2nl7qd2uuwyf2dn5kem4c4rurndi", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23vhyN5nLFBHCQ2zVeGTyfSUsCag3iCYsDeE7RDwUz2LaWqSAWUP6iJ47DiujWke9UG1r.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://3speak.tv/embed?v=spknetwork/wzbjmtil"], "description": "\u25b6\ufe0f Watch on 3Speak Space recording with @taskmaster4450. Vote for our Witness: Main Image Source About the SPK\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": "", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/wzbjmtil"},{"url": "https://hive.blog/hive/@disregardfiat/updates-in-october", "probability": 0.89856726, "headline": "Updates in October", "datePublished": "2022-10-24T04:25:15.095915", "datePublishedRaw": "6 months ago", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23t6zyagh772WQp6UysjBBLqS2xABRVKgAWA9kVvZoMWYmWugQJaemhVEiFLDp6HBWUjm.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23t6zyagh772WQp6UysjBBLqS2xABRVKgAWA9kVvZoMWYmWugQJaemhVEiFLDp6HBWUjm.png"], "description": "More furious dev work by disregardfiat", "articleBody": "Time for another update already?\n\nWe're nearing the end of October and like usual I've been a little too busy to write out 
what I've been up to. I know I'm going to miss some things, so feel free to ask questions.\n\nHive\n\nHardfork 26 came, and the talk among the witnesses has been the scheduler bug that is requiring us to have a new hardfork tomorrow. I've been blessed by a larger than normal share of blocks, missing zero. I have my signing witness up to date for HF27, and my API witness is doing a little replay.\n\nAPI Witness?!\n\nThat's right! While it won't have the full HAF/HiveMind suite of APIs, it should be a nice place to point Honeycomb services to, taking what is currently around 100 getBlock requests off the public APIs and lightening their loads. Hopefully you'll find this up and running at hive-api.dlux.io with the above-stated limitations.\n\nTrole\n\nI've been building a little proxy server that will allow access to services if a header is appropriately signed by Hive keys. This should allow some future improvements to the dlux and spk networks...\n\nDLUX\n\ndlux has had a major upgrade thanks to a lot of work from @markegiles. We've gotten rid of all our legacy PHP code and are running a 100% client-side-rendered Vue stack now. While some features are still offline, most DLUX network interactions should be available: NFTs, DEX, etc.\n\nThe only thing that's missing is dApp submission, which is being worked on with the above Trole proxy and our new...\n\nIPFS Gateway\n\nWe're completely off AWS and have a brand new IPFS gateway up. We've switched to a Cloudflare DNS provider for a bit of DDoS protection, and the Cloudflare APIs allow us to automate our dApp SSL security. Our wildcard subdomains ensure the average user of our site won't suffer cross-site scripting attacks via dApps. Further security improvements are planned as we roll out our uploader. Stay tuned!\n\nDLUX Token\n\nOur token software has been testing new processes to ensure uptime and patch some inconsistencies. It's more secure than ever. 
We worked with @mahdiyari to verify transaction signatures without asking the Hive APIs, which allows us to build multi-signature transactions in a timely fashion and brings further security to the multi-sig wallet, along with the stability improvements that brings to our consensus and healing algorithms.\n\nSPK Network\n\nThe SPK network is due for its 1.2 upgrade shortly, which will allow people to register services. It also has all the improvements of the DLUX Token. Proof of Access is sailing along at this point: I have all the infrastructure set up to begin testing. I've written contracts for the reconciliation of offline contracts, which will let users upload files/videos, ensure they've uploaded the right files, sign the steps offline, and have payments made when appropriate, all in real time. Of course, we'll need to do a deep dive into that system so the community can probe its operation and safety...\n\nProposal 234", "articleBodyHtml": "
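The multi-signature wallet mentioned above can be pictured as a simple weight-threshold check. This is only an illustrative sketch, not Honeycomb's actual code; the `meets_threshold` helper and the account names are hypothetical:

```python
# Illustrative multi-sig sketch (hypothetical names, not Honeycomb code):
# a transaction is valid once the holders who signed it carry enough
# combined weight to meet the wallet's signing threshold.

def meets_threshold(signers, weights, threshold):
    """Return True if the combined weight of unique `signers` reaches `threshold`."""
    total = sum(weights.get(account, 0) for account in set(signers))
    return total >= threshold

# 20 evenly weighted holders with a 17-of-20 threshold:
weights = {f"holder{i}": 1 for i in range(20)}
print(meets_threshold([f"holder{i}" for i in range(17)], weights, 17))  # True
print(meets_threshold([f"holder{i}" for i in range(16)], weights, 17))  # False
```

Note how the threshold choice cuts both ways: at 17-of-20, any 4 holders who withhold signatures can block the wallet entirely.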
", "canonicalUrl": "https://peakd.com/hive/@disregardfiat/updates-in-october"},{"url": "https://hive.blog/spk/@disregardfiat/spk-network-spk-governance", "probability": 0.910886, "headline": "SPK Network - SPK Governance", "datePublished": "2022-10-24T04:25:15.763463", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/EoEq2J1m5bUvbHBbTr1GYbgEg3GL2mEjV3qeHJ6sNsWik5tntmf4sUC72LrcLYRzcB1.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EoEq2J1m5bUvbHBbTr1GYbgEg3GL2mEjV3qeHJ6sNsWik5tntmf4sUC72LrcLYRzcB1.png"], "description": "Deep dive into the proposed SPK Governance voting design. Welcoming feedback during our development by disregardfiat", "articleBody": "SPK DPoS\n\nIt's day 2 of blog every day November. I've decided to power up all my posts so I can participate in PowerUpMonth as well. Anyway, on to the things that occupy both my days and dreams...\n\nAround HIVE we often shout about the benefits of Delegated Proof of Stake (DPoS). The SPK Network is built on top of a DPoS chain and will retain as much of the spirit of this paradigm as these conditions allow. The most pressing limitation is we can't as easily force a super majority of participants to have a collectively desired outcome. For example, with 20 accounts controlling a multi-signature wallet, we don't have many options about how the signatures will be arranged. If we set the multi-sig threshold at 17 then it only takes 4 accounts to hold the funds hostage. Which means our multi-signature has to do one of two things. Either set account weights based on stake, or rely on a simple majority to come to a decision.\n\nI prefer keeping the signature holders as even as possible in terms of reward shares and voting rights. 
Though I would be interested in seeing the competing paradigm play out.\n\nSetting the variables of the overall chain will come down to holders of SPK who have powered up their tokens. In the Hive paradigm you vote for your witnesses, you get 30 votes for 20 spots, and you give up your ability to vote on the price feed, account creation fee, software version, etc... In the proposed SPK paradigm you will retain the ability to vote on every item of governance.\n\nAdding a little balance to this system, the more interested you are in the service layer of SPK (storage, encoding, etc.), the more you earn in SPK tokens, and the more SPK tokens these service providers earn as well. This distributes governance at a faster rate than interest-only paradigms. Hive uses this style of metric as well in the 50-50% split for content moderation.\n\nDesign Considerations\n\nOne of the strengths of a state machine over a traditional blockchain is that it can have a smaller memory footprint. Utilizing the storage inherent in Hive, the SPK network can allow voting that doesn't keep a record of actual votes for every account. Rapid replays and low costs are both preferable to storing a set of votes for every account that holds SPK Power.\n\nIn practice this is accomplished by two things. First, every vote needs to include every parameter one wishes to change. Second, as voting will always be \"open\", the dynamics of the system must allow variables to move freely. When somebody votes, their power amount will be subtracted from the total power amount. Say 10 out of 100: the current variables will then have a 90-to-10 bias when a new average is determined.\n\n(10% * User_with_10%_stake_Variable_Vote) + (90% * Variable_currently) = Variable_new_value\n\nTo prevent one account from voting over and over to push variables toward a goal, the last vote time will be recorded. 
This will effectively allow only one full-sized vote per voting period.\n\nSo if the same account votes in the very next block, their effective power will be ~0 and the average's bias will be ~100.\n\nThis paradigm also has the added benefit of keeping variables in a steadier state, which should encourage platform utilization.\n\nOne further consideration is the possibility of letting the cool-down period for several accounts reset, then casting full-sized votes with all of them at once. For this reason there is an additional decay parameter to keep disinterested voters from having a bigger impact when they decide to vote again. For instance, if the cool-down period is 3 months, then from 3 months to 6 months the vote power will decrease from 100% to 50%. This will also apply to stake that is newly powered up, to prevent certain kinds of governance attacks we've seen in the past.\n\nThe Near Future\n\nOne thing we worry about is voting apathy, the knowledge of the voters, over-weighted influence, etc... basically all of the things that any voting paradigm suffers from. Hive has quite a robust system to equalize the top 20, which provides some answer to most of these concerns.\n\nThe SPK Network will have a set of \"Validators\" that need to manage content in its contracts. These will be voted into a consensus group. They will likely be some of the biggest accounts in the network, but will also likely have a large discrepancy in actual stake.\n\nWe will keep track of voting apathy by keeping a running sum of votes over the decay period, then distributing those votes (and in practice the stake held by the top 20) equally among those same accounts. So if we have 50% of stake interested in governance, the top 20 will each have a 2.5% vote weight (the other 50% divided by 20). 
This should cover knowledge, equality, apathy, and some democracy/republic questions, as any voter who abstains will instantly be proxied evenly among those most knowledgeable.\n\nVotable Variables\n\nHere are the parameters the SPK votes are currently planned to control:\n\nSPK Power Down Interval / Voting Decay\n\nTo maintain system security, votes will not be allowed to happen more often by powering down from one account and powering up to another. To protect against this vector, the Power Down Interval is also the voting decay interval. Presently the power-down time is set to 800000 blocks, which is roughly 4 weeks. I personally would like to see this time increase to roughly 3-4 months, making the voting cycle a quarterly concern much like traditional corporate bookkeeping cycles and allowing trends to be analyzed over time and governance strategies to be put forth by members.\n\nLARYNX Power Down Interval\n\nAs this number isn't tied to security, a separate time can be used: something long enough to discourage abuse of the service platform, and short enough to encourage people to power up LARYNX and obtain services.\n\nNumber of Runners\n\nHive has a current limit of 40 keys per account. This sets a hard limit of 79 runners under the current paradigm of giving partial authority to the most-collateralized runners rather than a simple median (80/2 + 1 > 40). It is currently set to 25, which provides up to 13 key holders. The larger this number is, the more secure each wallet is and the more irreversible blocks are. The actual parameter is variable between 25 and 79, while votes are possible from 10 to 94 to allow a little acceleration toward a limit that would otherwise prevent 79 or 25 from being reached.\n\nSPK Generation Rates\n\nThese are where SPK comes from. 
There are three different rates, currently 0.1%, 0.015%, and 0.01%: the highest for LARYNX powered up by an infrastructure provider; the middle for LARYNX delegated to an infrastructure provider (both the delegatee and delegator enjoy this rate); and the lowest for somebody who has powered up their LARYNX but hasn't put it to purpose. These rates will have floating limits: 0 to the middle rate, 1 to 5 times the lowest rate, and 2 to 5 times the middle rate. These limits will keep the game theory of operating services balanced. I would also like to see a BROCA rebate for delegators to incentivize both good services and lower prices, but these aspects will be outside of consensus. More information\n\nDEX Parameters\n\nThese include the fee percent, slope, and max values discussed here\n\nDAO Claim Percent\n\nFuture Parameters\n\nOnce the Airdrop is over, the Service Infrastructure Pools come into play. This is when the SPK network will start moving capital toward goals. This means a whole new set of parameters to control: what the SIP splits its funds into, how the DAO is utilized, the inflation rate that will best balance capital inflows, etc.\n\nInviting Discussion", "articleBodyHtml": "
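The three generation rates above can be sketched as a lookup keyed by how the LARYNX is being used. A minimal illustration, assuming the rates are per-period multipliers; the function and status names are hypothetical, not network code:

```python
# Illustrative sketch of the three SPK generation rates from the post
# (0.1%, 0.015%, 0.01%). Status keys and function name are hypothetical.

RATES = {
    "provider": 0.001,     # LARYNX powered up by an infrastructure provider
    "delegated": 0.00015,  # LARYNX delegated to a provider (both sides earn this)
    "idle": 0.0001,        # LARYNX powered up but not put to purpose
}

def spk_generated(larynx_power, status):
    """SPK earned in one period for a given LARYNX balance and usage status."""
    return larynx_power * RATES[status]

# 10,000 LARYNX earns roughly 10 SPK as a provider, 1.5 delegated, 1 idle.
for status in RATES:
    print(status, spk_generated(10_000, status))
```

The spread between the rates is what carries the game theory: putting stake to work earns an order of magnitude more than letting it sit.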
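The stake-weighted averaging and cool-down decay described in the governance post above can be sketched as follows. This is an illustrative model only: the function names and the linear ramp/decay shapes are assumptions, not the network's actual code.

```python
# Illustrative model of SPK "open" voting: a voter's stake is blended
# against the current variable value, and effective power depends on
# time since that account's last vote.

def effective_power(stake, blocks_since_vote, cooldown_blocks):
    """Ramp from 0 to full power over one cool-down period, then decay
    linearly toward 50% over the next period (the disinterest decay)."""
    if blocks_since_vote <= cooldown_blocks:
        return stake * blocks_since_vote / cooldown_blocks
    # Past one full period: 100% at one period down to 50% at two periods.
    periods_over = min((blocks_since_vote - cooldown_blocks) / cooldown_blocks, 1.0)
    return stake * (1.0 - 0.5 * periods_over)

def apply_vote(current_value, voted_value, voter_power, total_power):
    """(frac * vote) + ((1 - frac) * current), per the formula in the post."""
    frac = voter_power / total_power
    return frac * voted_value + (1.0 - frac) * current_value

# A voter holding 10% of powered-up stake moves the variable 10% of the way:
print(apply_vote(100.0, 200.0, voter_power=10.0, total_power=100.0))  # ≈ 110.0
```

Voting again in the very next block contributes ~0 effective power, and stake that sits idle past one full period decays toward half weight, matching the two protections the post describes.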
", "canonicalUrl": "https://peakd.com/spk/@disregardfiat/spk-network-spk-governance"},{"url": "https://hive.blog/blog/@disregardfiat/blogging-in-november", "probability": 0.9536791, "headline": "Blogging in November", "datePublished": "2022-10-24T04:25:19.771974", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/EpvivosGtaHQwdw7LAJtrAz7FjBwrrgKTS1cx5z5nEi4eU8nEEauX8LtFoHHeq6UkVC.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EpvivosGtaHQwdw7LAJtrAz7FjBwrrgKTS1cx5z5nEi4eU8nEEauX8LtFoHHeq6UkVC.jpg"], "description": "Blog everyday of November... here we go. by disregardfiat", "articleBody": "National Blogging Month?\n\nI recently saw a tweet by @hivetrending that said November was blogging month. I did a little internet search and confirmed November was a \"national blogging month\" and which ever \"nation\" might be responsible for the edict isn't immediately clear. So here I am, once again trying to blog a little more frequently on this social network that I spend nearly all of my productive time on... though you wouldn't know it from the amount of posts I make.\n\nSo as not to bore anybody I'll try to keep my 30 day challenge to several short topics, 500-1000 words each. Not going to lie, these will most likely be about DAO security, bootstrapping, DEXs, Multi-Signature, etc. But I'll try to get some personal stuff in there as well as a blog or two in Spanish and Portuguese.\n\nFor those that don't want to scroll back 5 years to find my introduction post allow me to reintroduce myself. I'm Steven Ettinger. While I wish that cat wasn't out of the bag with developer arrests in my current field I also doubt I could be any safer as an anon who still carries a phone or sells crypto for fiat. 
I am an American in the sense that I consider the American continents home, and I currently reside in the interior of South America. I speak English and a mixture of Spanish and Portuguese that I'm frustratingly aware of as soon as the wrong words leave my mouth. I've been studying Japanese for a long time but can still often understand French more easily despite never trying to learn it.\n\nIn my introduction post to Steem/Hive I said I believe I'll be in space one day, and it appears that eventuality is becoming more and more likely. Apart from learning languages I love to travel, and have been living a nomadic lifestyle most of the time. I haven't stayed in one place longer than a year since I turned 18, and have visited 20-23 countries depending on how you count a visit. I can hold my own in most videogames, can cook fairly well, and I enjoy house and rock music, Deftones being my favorite band.\n\nProfessionally I started as an electronics technician, moved through nuclear power training, and operated 4 reactors across 3 platforms. After a growing distaste for red tape I switched fields to radiation therapy and imaging, then caught the crypto bug, where I've been toiling on tooling for the past 5 years.\n\nWhat I Do Here\n\nI feel like I dream big and work hard. @markegiles and I have been working toward our DLUX platform for a very long time. It's seen its ups and downs, but there is always forward movement. We are responsible for the first DEX here, which autonomously coordinated escrow transfers, and the following DEX, which autonomously controls a multi-signature wallet between multiple parties. We built a publishing paradigm as well as an NFT paradigm. This is all available on dlux.io. I've currently been spending most of my time building the SPK network with the 3speak.tv team. Dlux and 3speak both have a need to store files larger than will fit in a blog, and this technology will benefit a larger decentralized ecosystem. 
Soon I hope this all comes together into a more complete experience across all of these platforms. I've open-sourced my DEX/token called HoneyComb, and there are 3 communities currently represented here: DLUX, SPKCC, and DUAT (Ragnarok).\n\nI run a Hive witness, as well as a bigger, beefier backup and a limited API node to support the needs of the Honeycomb network. I'm also \"employed\" by Hive through the HoneyComb proposal. Further witness votes and proposal votes are always welcome in a constantly evolving ecosystem such as this.", "articleBodyHtml": "
\n\n

National Blogging Month?

\n\n

I recently saw a tweet by @hivetrending that said November was blogging month. I did a little internet search and confirmed November was a \"national blogging month\" and which ever \"nation\" might be responsible for the edict isn't immediately clear. So here I am, once again trying to blog a little more frequently on this social network that I spend nearly all of my productive time on... though you wouldn't know it from the amount of posts I make.

\n\n

So as not to bore anybody I'll try to keep my 30 day challenge to several short topics, 500-1000 words each. Not going to lie, these will most likely be about DAO security, bootstrapping, DEXs, Multi-Signature, etc. But I'll try to get some personal stuff in there as well as a blog or two in Spanish and Portuguese.

\n\n
\"@markegiles
\n\n

For those that don't want to scroll back 5 years to find my introduction post allow me to reintroduce myself. I'm Steven Ettinger. While I wish that cat wasn't out of the bag with developer arrests in my current field I also doubt I could be any safer as an anon who still carries a phone or sells crypto for fiat. I am an american in the sense I consider the American continents home, and I currently reside in the interior of South America. I speak English and a mixture of Spanish and Portugese that I'm frustratingly aware of as soon as the wrong words leave my mouth. I've been studying Japanese for a long time but can still often understand French easier despite never trying to learn.

\n\n

In my introduction to steem hive post I said I believe I'll be in space one day and it appears that eventuality is becoming more and more likely. Apart from learning languages I love to travel, and have been living a nomadic lifestyle most of the time. I haven't stayed in one place longer than a year since I've turned 18, and have visited 20-23 countries depending on how you count a visit. I can hold my own in most videogames, can cook fairly well and I enjoy house and rock music; Deftones being my favorite band.


Professionally, I started as an electronics technician, moved through nuclear power training, and operated 4 reactors across 3 platforms. After a growing distaste for red tape I switched fields to radiation therapy and imaging, then caught the crypto bug, and I've been toiling on tooling for the past 5 years.


What I Do Here


I feel like I dream big and work hard. @markegiles and I have been working toward our DLUX platform for a very long time. It has seen its ups and downs, but there is always forward movement. We are responsible for the first DEX here, which autonomously coordinated escrow transfers, and the following DEX, which autonomously controls a multi-signature wallet between multiple parties. We built a publishing paradigm as well as an NFT paradigm, all available on dlux.io. I've currently been spending most of my time building the SPK Network with the 3speak.tv team. DLUX and 3Speak both need to store files larger than will fit in a blog post, and this technology will benefit a larger decentralized ecosystem. Soon I hope this all comes together into a more complete experience across all of these platforms. I've open-sourced my DEX/token called HoneyComb, and there are 3 communities currently represented here: DLUX, SPKCC, and DUAT (Ragnarok).


I run a Hive witness, as well as a bigger, beefier backup and a limited API node to support the needs of the HoneyComb network. I'm also \"employed\" by Hive through the HoneyComb proposal. Witness votes and proposal votes are always welcome in a constantly evolving ecosystem such as this.

", "canonicalUrl": "https://peakd.com/blog/@disregardfiat/blogging-in-november"},{"url": "https://hive.blog/threespeak/@threespeak/introducing-the-acela-core-upgrading-existing-web2-apps-into-true-web3-dapps", "probability": 0.97436506, "headline": "Introducing the Acela Core - Upgrading Existing Web2 Apps into True Web3 Dapps", "datePublished": "2023-02-24T04:25:20.412327", "datePublishedRaw": "2 months ago", "author": "threespeak", "authorsList": ["threespeak"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/eddiespino/23vsWvGpC4kX7inUsuuKMC2PAsYa4gybMmaVpXH9y1MxhFR7FdqYq6yvpPPBnEf7hoZFQ.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/eddiespino/23vsWvGpC4kX7inUsuuKMC2PAsYa4gybMmaVpXH9y1MxhFR7FdqYq6yvpPPBnEf7hoZFQ.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/threespeak/23tHdQDJvtKweD8wKUTxrzhyd2Q7A7GhakcHC6Ffe8bJ6oicN1wXxVjDpDUqN1mdddCBs.png"], "description": "Introducing the Acela Core - Upgrading Existing Web2 Apps into True Web3 Dapps Hello, Community! Welcome to another 3Speak update! In this post will talk\u2026 by threespeak", "articleBody": "Hello, Community!\n\nWelcome to another 3Speak update! In this post will talk about Acela Core, the new video platform backend we are developing for 3Speak.tv and SPK Network apps. As we develop this backend, it will play a critical role in bridging the gap between web2 like platforms and web3 land. We intend for this blog post to be a rough overview as each topic mentioned can be expanded out to dozens of individual posts. Over time we will gradually release more updates and information about what this new backend would entail.\n\nThink of it similar to a layered cake. At the very bottom, the first layer of the cake is HIVE and other web3 tech. Then, one level up is a barrage of indexers to make the base layer data useful (SPK Network union indexer is a perfect example). 
Then, finally, the icing on the cake, the Acela Core, will go the remaining extra mile to provide platforms will a high end web3 experience without needing to write a ton of backend code.\n\nCreating a platform is hard, you need:\n\nAccount management (authentication)\n\nThe authentication side of the backend combines various authentication and identity systems such as HIVE keychain, metamask, SSO, and more into a single unified backend. This doesn't just apply to 3Speak, but other apps as well wanting to utilize this backend. A portion of this is simply linking your HIVE account to a username/password login and using that as the primary logins means. But also being able to create a HIVE account without needing to know about HIVE at all (web2 like experience). On the more complicated side of the spectrum is HIVE proxy accounts. With this, we can bridge operations from non HIVE accounts onto chain via a set of dedicated proxy accounts maintained by 3Speak or others. While this is a later step in the development process, we are actively looking at building it into the backend.\n\nThe authentication backend would have a simple set of APIs for handling login, posting, voting, account creation, etc. All can be easily accessed via regular HTTP calls with proper authentication token created during the login process.\n\nStorage layer\n\nLarge scale projects involving thousands of files and hundreds of TBs of data tend to require a dedicated service for handling data storage, management of data storage and integrations with other services. To begin with, storage on IPFS isn't easy. You can't just put a file on IPFS and expect everything to work 100% of the time. Thus, we are building a dedicated service to handle uploads, communication with a dedicated IPFS-cluster to manage part of the IPFS storage and a other misc functions. 
Additionally this storage management service would handle SPK Network POA (Proof of Access) integration at a later date once available.\n\nIndexing\n\nBy default in HIVE, there are only a handful of indexing systems available. Many of which are focused around providing their own forms of social indexing, and all of them do not have support for offchain posting. You can't just execute an API call on any random normal HIVE node and to do a massive complex query. That is why we are building the SPK Network's union indexer to turn HIVE on chain data and most importantly offchain data into a web2 like database. By indexing all this data we can do things like full text search. And we can even do advanced recommendations to improve the trending or user feed. We can mix both on and off chain content into the same feed. As time goes on we will publish more detailed blogs on such topics.\n\nWrite capabilities\n\nCurrently most write operations such as posting a video, commenting, voting and more are done by first granting posting authority to @threespeak then using an API to interact in a web2 like fashion. The API translates incoming requests into on chain actions on the behalf of the user. This gives us lots of flexibility to do things like scheduled posts, login via email/password etc. We will be keeping this largely the same in the new backend, but modernized to significantly usability and overall quality of the backend. Additionally, HIVE proxy accounts will come into play where instead of posting directly to an account on a 1:1 basis, we can directly post onto a shared proxy account for hundreds or even thousands of users. Utilizing the union indexer to differentiate between users.\n\nWe also acknowledge login options exist such as HAS (Hive Authentication Service)/Hive Keychain are available where the user can directly sign for actions. 
We will be working that into the Acela Core where it fits/is necessary.\n\nVideo Encoding\n\nGenerally speaking, all videos on the 3Speak site need to go through some form of encoding. Whether that reason is to create multiple resolutions of the same video or to reduce the size of the original video. It's widely considered necessary. The new backend will provide an interface to communicate with a dedicated SPK video encoder cluster for video encoding needs. That cluster is either operated by the a platform, or a remote 3rd party of users who form their own encoding clusters. The video encoding part of the backend is heavily tied into the auth, and most importantly storage.\n\nHealth checks\n\nWe will be creating a dedicated service for handling healthchecks of the backend. To start this covers mundane things like an API being down and notifying developers. But also more complicated tasks such as verifying integrity of all stored video content and post metadata. Healthcheck system status will be available on a webpage and through a discord bot for more frequent updates in the future. This will play a critical role during the development process and maturity phase of the backend. Even more so important for platforms that might not have any idea what is going on during the initial phases of setup.\n\nThis is why we are developing the Acela Core for platforms like 3Speak and others to leverage without spending significant time developing a customized backend. It's also entirely open source and able to easily fork and build your own copy. Why build your own web3 infrastructure when the Hive and SPK Network community already provides it for free? 
Not only will these changes be important to the advancement of the 3Speak.tv platform, but also the entire web3 content and video community.\n\nEnding notes\n\nStay tuned for our upcoming witness blog post that should be available in coming days!\n\nRight-click and open on a new tab to see in full details.\n\nThe above graphic is an in progress and incomplete representation of the new backend architecture.", "articleBodyHtml": "

Hello, Community!

Welcome to another 3Speak update! In this post we'll talk about the Acela Core, the new video platform backend we are developing for 3Speak.tv and SPK Network apps. As we develop this backend, it will play a critical role in bridging the gap between web2-like platforms and web3 land. We intend this blog post to be a rough overview, as each topic mentioned could be expanded into dozens of individual posts. Over time we will gradually release more updates and information about what this new backend entails.

Think of it like a layered cake. At the very bottom, the first layer of the cake is HIVE and other web3 tech. One level up is a barrage of indexers that make the base layer data useful (the SPK Network union indexer is a perfect example). Then, finally, the icing on the cake, the Acela Core, goes the remaining extra mile to provide platforms with a high-end web3 experience without needing to write a ton of backend code.

Creating a platform is hard; you need:

  • Account management (authentication)

The authentication side of the backend combines various authentication and identity systems, such as HIVE Keychain, MetaMask, SSO, and more, into a single unified backend. This doesn't just apply to 3Speak, but to other apps that want to utilize this backend as well. Part of this is simply linking your HIVE account to a username/password login and using that as the primary login method, but also being able to create a HIVE account without needing to know about HIVE at all (a web2-like experience). On the more complicated side of the spectrum are HIVE proxy accounts. With these, we can bridge operations from non-HIVE accounts onto the chain via a set of dedicated proxy accounts maintained by 3Speak or others. While this is a later step in the development process, we are actively looking at building it into the backend.

The authentication backend will have a simple set of APIs for handling login, posting, voting, account creation, etc. All can be easily accessed via regular HTTP calls with a proper authentication token created during the login process.
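To make that concrete, a client of such a backend might look roughly like this. Every route name and field below is a hypothetical placeholder for illustration, not the actual Acela Core API:

```javascript
// Hypothetical client helpers for a unified auth backend.
// Routes ('/api/v1/...'), field names, and the bearer-token scheme are
// illustrative assumptions, not the real Acela Core interface.
function buildLoginRequest(username, password) {
  return {
    url: '/api/v1/login', // hypothetical login route
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username, password }),
  };
}

// The token returned by login authorizes later actions (post, vote, ...).
function buildAuthedRequest(token, path, payload) {
  return {
    url: path,
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${token}`,
    },
    body: JSON.stringify(payload),
  };
}

// Example: a vote request, ready to hand to fetch().
const voteReq = buildAuthedRequest('example-token', '/api/v1/vote', {
  author: 'alice',
  permlink: 'my-post',
  weight: 10000,
});
```

A frontend would then send these with an ordinary `fetch(baseUrl + req.url, req)` call.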

  • Storage layer

Large scale projects involving thousands of files and hundreds of TBs of data tend to require a dedicated service for handling data storage, management of that storage, and integrations with other services. To begin with, storage on IPFS isn't easy. You can't just put a file on IPFS and expect everything to work 100% of the time. Thus, we are building a dedicated service to handle uploads, communication with a dedicated IPFS-cluster that manages part of the IPFS storage, and other misc functions. Additionally, this storage management service will handle SPK Network PoA (Proof of Access) integration at a later date, once available.
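Since a single IPFS add or pin can fail at any time, a storage service like this typically wraps its calls in retries. A minimal sketch of the idea, where `pinFn` stands in for whatever call the real service makes to its IPFS-cluster:

```javascript
// Retry wrapper for an unreliable storage call. `pinFn` is a stand-in for
// a real IPFS-cluster pin request; the fixed attempt count is illustrative.
async function pinWithRetry(pinFn, cid, attempts = 3) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await pinFn(cid); // return on the first successful attempt
    } catch (err) {
      lastErr = err; // remember the failure and try again
    }
  }
  throw lastErr; // every attempt failed
}
```

A production service would add backoff and alerting on top, but the shape is the same: assume any individual IPFS operation can fail and plan for it.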

  • Indexing

By default in HIVE, there are only a handful of indexing systems available. Many of them are focused on providing their own forms of social indexing, and none of them support offchain posting. You can't just execute an API call on any random HIVE node to do a massive, complex query. That is why we are building the SPK Network's union indexer: to turn HIVE on-chain data, and most importantly offchain data, into a web2-like database. By indexing all this data we can do things like full text search, and we can even do advanced recommendations to improve the trending or user feed. We can mix both on-chain and offchain content into the same feed. As time goes on we will publish more detailed blogs on these topics.
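Mixing on-chain and offchain content into one feed, for example, boils down to sorting rows the indexer has already collected from both sources. A toy sketch (the item shape is an assumption for this example):

```javascript
// Toy version of a mixed feed: once on-chain and off-chain posts are both
// indexed, one sort over their union produces a single feed.
// The { created: <timestamp> } shape is assumed for illustration.
function mergeFeeds(onChain, offChain) {
  return [...onChain, ...offChain].sort((a, b) => b.created - a.created); // newest first
}
```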

  • Write capabilities

Currently, most write operations, such as posting a video, commenting, and voting, are done by first granting posting authority to @threespeak and then using an API to interact in a web2-like fashion. The API translates incoming requests into on-chain actions on behalf of the user. This gives us lots of flexibility to do things like scheduled posts, login via email/password, etc. We will be keeping this largely the same in the new backend, but modernized to significantly improve the usability and overall quality of the backend. Additionally, HIVE proxy accounts will come into play: instead of posting directly to an account on a 1:1 basis, we can post onto a shared proxy account for hundreds or even thousands of users, utilizing the union indexer to differentiate between them.
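Conceptually, routing an action through a shared proxy account means wrapping the real user's action with an identifier the union indexer can read back out. A rough sketch; the operation shape, the custom_json id, and the field names are all hypothetical:

```javascript
// Hypothetical shape of a user action routed through a shared proxy account.
// The proxy account signs one on-chain operation; the embedded app_user
// field is what would let an indexer attribute it to the real user.
function wrapForProxy(proxyAccount, appUser, action) {
  return {
    required_posting_auths: [proxyAccount], // the shared proxy signs
    id: 'acela_proxy',                      // hypothetical custom_json id
    json: JSON.stringify({ app_user: appUser, action }),
  };
}
```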

We also acknowledge that login options such as HAS (Hive Authentication Service) and Hive Keychain are available, where the user can directly sign actions. We will be working those into the Acela Core where it fits or is necessary.

  • Video Encoding

Generally speaking, all videos on the 3Speak site need to go through some form of encoding, whether to create multiple resolutions of the same video or to reduce the size of the original; it's widely considered necessary. The new backend will provide an interface to communicate with a dedicated SPK video encoder cluster for video encoding needs. That cluster is either operated by the platform, or by remote third-party users who form their own encoding clusters. The video encoding part of the backend is heavily tied into auth and, most importantly, storage.
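Picking which resolutions to produce usually amounts to "every rung of a rendition ladder at or below the source". A sketch; the ladder values are illustrative, not 3Speak's actual encoder settings:

```javascript
// Choose output resolutions no taller than the source video.
// The ladder is a common choice, not the actual 3Speak configuration.
const LADDER = [1080, 720, 480, 240];

function pickRenditions(sourceHeight) {
  const fits = LADDER.filter((h) => h <= sourceHeight);
  // always produce at least one rendition, even for very small sources
  return fits.length ? fits : [LADDER[LADDER.length - 1]];
}
```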

  • Health checks

We will be creating a dedicated service for handling healthchecks of the backend. To start, this covers mundane things like an API being down and notifying developers, but also more complicated tasks such as verifying the integrity of all stored video content and post metadata. Healthcheck system status will be available on a webpage and, in the future, through a discord bot for more frequent updates. This will play a critical role during the development process and maturity phase of the backend, and it is even more important for platforms that might not have any idea what is going on during the initial phases of setup.
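At its simplest, such a service boils per-component probes down to one overall status plus a list of what is failing; a sketch:

```javascript
// Reduce individual probe results (name -> healthy?) to the kind of summary
// a status webpage or discord bot would display.
function summarize(probes) {
  const failing = Object.entries(probes)
    .filter(([, healthy]) => !healthy)
    .map(([name]) => name);
  return { status: failing.length ? 'degraded' : 'ok', failing };
}
```

Deeper checks (content integrity, metadata verification) would feed the same summary as extra probe entries.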

This is why we are developing the Acela Core: for platforms like 3Speak and others to leverage without spending significant time developing a customized backend. It's also entirely open source, so you can easily fork it and build your own copy. Why build your own web3 infrastructure when the Hive and SPK Network community already provides it for free? Not only will these changes be important to the advancement of the 3Speak.tv platform, but also to the entire web3 content and video community.


Ending notes


Stay tuned for our upcoming witness blog post, which should be available in the coming days!

[image: in-progress diagram of the new backend architecture]

Right-click and open in a new tab to see the full details.

The above graphic is an in-progress, incomplete representation of the new backend architecture.

", "canonicalUrl": "https://peakd.com/threespeak/@threespeak/introducing-the-acela-core-upgrading-existing-web2-apps-into-true-web3-dapps"},{"url": "https://hive.blog/hive/@disregardfiat/dlux-dapp-security-deep-dive", "probability": 0.783564, "headline": "DLUX dApp Security Deep Dive", "datePublished": "2022-10-24T04:25:21.107411", "datePublishedRaw": "6 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23tRvMrX4xVmZiU9GsgYQ1eAYKjrwvbJEiQiFPUy1kNp7wJUtuzNnA2FrVpMy1uTFvzDM.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tRvMrX4xVmZiU9GsgYQ1eAYKjrwvbJEiQiFPUy1kNp7wJUtuzNnA2FrVpMy1uTFvzDM.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23wfmxBxLZG7V7B5EUWySMZugC6w5xj2PvwtsDhjfZ2jnPpAdv1vB7sn3tap2fbDZteqm.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/Eqqvm2VzyWEKjUMEnZ74DwqA1QGce6Z4LbmZy8UDEq67VsRnUuKQ3GvCKeCpLmBbo5f.png"], "description": "Wth a Capture the Flag challenge. by disregardfiat", "articleBody": "Security Through Transparency\n\ndApps on IPFS\n\nAnybody who watches The Lock Picking Lawyer knows that traditional security is provided mostly through obscurity, as he demonstrates with nearly every video that this kind of \"security\" is no security at all.\n\nI've just put the last checks together that I believe makes the dlux dApp distribution scheme as secure as any traditional website. Let's talk about how this works, pose a challenge, and invite some commentary.\n\nArbitrary Code Execution\n\nThe real name of the game is executing arbitrary code in a certain context. Traditional websites have had this exploit thru all sorts of means and if you decide to build a dApp on DLUX you'll have the same expectations of knowledge as any normal website developer. 
Things that we can't control are obviously out of bounds, such as setting up a phishing website and getting a user to click a link in an email. However, the way dlux dApps are set up it was possible to to send somebody a link, get them to load a page inside of the same sandbox as usual... which would mean exposing that dApps data through cookies, localStorage, and even sessionStorage in some cases. Depending on the nature of the dApp this could mean leaking valuable information.\n\nDLUX dApp Paradigm\n\nIn a few words DLUX dApps should be roughly identical to setting up your own static website, on your own server, with your own SSL certificates. There are numerous ways to accomplish the same thing on today's internet such as github pages. The major differences here are instead of free hosting, Hive has content rewards. Posting an app on github pages like The SPK Network Monitor won't earn you any cryptocurrency and has the possibility of being censored or deleted by Github.\n\nDLUX | Language classes for Ukrainian refugees in VR is a very simple dApp that just displays some 360 images. If I wanted to censor this dApp I could, but the goal of decentralization is to have multiple people run multiple frontends, or even have a local application that can deliver this content no matter the whims of certain individuals.\n\nI hope this paradigm meets or exceeds any other UX for both the developer and the user in terms of speed, trust, security, and usability.\n\nBreaking Down DLUX Security\n\nUsing the above dApp; looking at the domain you will find bezkresu.ipfs.dlux.io. It's set up in such a way that only @bezkresu can post dApps that will run from this domain. Let's find out how this works.\n\nClicking on this app from dlux.io will generate this link:\n\nLinks to dlux.io\n\nThis link will likely benefit from a HEAD request to enable link previews. 
GET requests will have dlux.io will serve this static file with the following interesting code.\n\nconst author = window.location.pathname.split('/')[2].replace('@', '') const permlink = window.location.pathname.split('/')[3] fetch(\"https://api.hive.blog\") .then((r) => r.json()) .then(res => { stateObj = res.result metadata = stateObj.json_metadata hashy = JSON.parse(metadata).vrHash, vars = `?${location.href.split('?')[1]}` || `?` //... function match (s,t) {var a=[];for(var i=0;i =0){a.push(j);i=j}else return a}} subauthor = match(author,'.').length ? match(author,'.').join('') + author.replace('.', '-') : author ipfsdomain = `https://${subauthor}.ipfs.dlux.io`; location.href = ipfsdomain + `${vars}&hash=${hashy}&author=${author}&permlink=${permlink}&user=${user}`\n\nIt forwards the request to an IPFS enabled subdomain. Probably the hardest thing to understand here is Hive Accounts can have a . in them which would make this subdomain 2 or more subdomains. If @your.app account created a dApp, @your-app could be made to post a dApp that could access @your.app 's subdomain. Since hive accounts can't start with a number, this will be used as a place to index where .'s are replaced with -'s. your-app and 4your-app in this case.\n\nManaging an iFrame\n\nOur IPFS server only has one file to serve. This file checks some signatures indirectly and puts the dApp in an iFrame. Let's see how this works.\n\nThe match function is the same as above. It does it's own check to see if it's on an authorized subdomain before asking a Hive API for the post content. 
The user following a link to an unqualified domain will get a warning message and no iFrame will be set up.\n\nCaddy Configuration\n\nFinally, to serve anything out of our IPFS's subdomain gateway we've configured Caddy as follows.\n\n*.ipfs.dlux.io { root * /var/www/html/ipfs file_server @ipfs { header Referer https://{labels.3}.ipfs.dlux.io* } handle @ipfs { reverse_proxy /ipfs/* localhost:8080 } tls { dns cloudflare {api-key} } }\n\n*.ipfs.dlux.io handles our wildcard subdomain.\nfile_server serves our one and only file that checks subdomains, and set's up the iFrame sandbox\n@ipfs defines a rule where the referer matches the current subdomain\nhandle @ipfs forwards ipfs/CIDs to the IPFS instance to load out the dApp.\ntls give Caddy the information it needs to keep our SSL certs up to date.\n\nCapture the Flag\n\nI've put a 'secret' in my localStorage.\n\nI'll click on any link posted below. If you can get my secret from my localStorage I'd love to know how. So much in fact that I'll offer a 50,000 DLUX bounty (or 500 Hive).\n\nIf you can think of improvements, I want to know. If you have questions, ask them.\n\nI hope that our sandbox is just as secure as any other website. That phishing out of a non-managed url is the best an attacker can do... and the sandbox only executes code that the author wrote.", "articleBodyHtml": "

Security Through Transparency


dApps on IPFS


Anybody who watches The Lock Picking Lawyer knows that traditional security is provided mostly through obscurity, as he demonstrates with nearly every video that this kind of \"security\" is no security at all.

I've just put together the last checks that I believe make the DLUX dApp distribution scheme as secure as any traditional website. Let's talk about how this works, pose a challenge, and invite some commentary.


Arbitrary Code Execution

The real name of the game is executing arbitrary code in a certain context. Traditional websites have had this exploit through all sorts of means, and if you decide to build a dApp on DLUX you'll have the same expectations of knowledge as any normal website developer. Things that we can't control are obviously out of bounds, such as setting up a phishing website and getting a user to click a link in an email. However, the way DLUX dApps were set up, it was possible to send somebody a link and get them to load a page inside the same sandbox as usual... which would mean exposing that dApp's data through cookies, localStorage, and even sessionStorage in some cases. Depending on the nature of the dApp, this could mean leaking valuable information.


DLUX dApp Paradigm

In a few words, DLUX dApps should be roughly identical to setting up your own static website, on your own server, with your own SSL certificates. There are numerous ways to accomplish the same thing on today's internet, such as GitHub Pages. The major difference here is that instead of free hosting, Hive has content rewards. Posting an app on GitHub Pages, like The SPK Network Monitor, won't earn you any cryptocurrency, and it has the possibility of being censored or deleted by GitHub.


DLUX | Language classes for Ukrainian refugees in VR is a very simple dApp that just displays some 360 images. If I wanted to censor this dApp I could, but the goal of decentralization is to have multiple people run multiple frontends, or even have a local application that can deliver this content no matter the whims of certain individuals.


I hope this paradigm meets or exceeds any other UX for both the developer and the user in terms of speed, trust, security, and usability.


Breaking Down DLUX Security

Using the above dApp as an example: looking at the domain, you will find bezkresu.ipfs.dlux.io. It's set up in such a way that only @bezkresu can post dApps that will run from this domain. Let's find out how this works.


Clicking on this app from dlux.io will generate this link:

Links to dlux.io

This link will likely receive a HEAD request to enable link previews. For GET requests, dlux.io will serve a static file with the following interesting code.

const author = window.location.pathname.split('/')[2].replace('@', '')
const permlink = window.location.pathname.split('/')[3]
fetch("https://api.hive.blog")
  .then((r) => r.json())
  .then(res => {
    stateObj = res.result
    metadata = stateObj.json_metadata
    hashy = JSON.parse(metadata).vrHash,
    vars = `?${location.href.split('?')[1]}` || `?`
    //...
    function match (s,t) {var a=[];for(var i=0;i<s.length;i++){j=s.indexOf(t,i);if(j>=0){a.push(j);i=j}else return a}}
    subauthor = match(author,'.').length ? match(author,'.').join('') + author.replace('.', '-') : author
    ipfsdomain = `https://${subauthor}.ipfs.dlux.io`;
    location.href = ipfsdomain + `${vars}&hash=${hashy}&author=${author}&permlink=${permlink}&user=${user}`

It forwards the request to an IPFS-enabled subdomain. Probably the hardest thing to understand here is that Hive account names can contain a ., which would make this subdomain two or more subdomains. If the @your.app account created a dApp, @your-app could otherwise post a dApp that could access @your.app's subdomain. Since Hive accounts can't start with a number, the index of each . is prefixed to the name and the .'s are replaced with -'s: your-app stays your-app, while your.app becomes 4your-app.
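The mapping is easy to check in isolation. This is the same match/subauthor logic as the redirect code above (with a fall-through return added so match also handles a match on the final character):

```javascript
// Indices of every occurrence of t in s, e.g. match('your.app', '.') -> [4].
function match(s, t) {
  var a = [];
  for (var i = 0; i < s.length; i++) {
    var j = s.indexOf(t, i);
    if (j >= 0) { a.push(j); i = j; } else return a;
  }
  return a; // added: cover the case where the last character matches
}

// Prefix the dot positions, then swap '.' for '-'. Because no Hive account
// may start with a digit, the prefixed form can't collide with a real
// hyphenated account name.
function subauthor(author) {
  return match(author, '.').length
    ? match(author, '.').join('') + author.replace('.', '-')
    : author;
}

// subauthor('your.app') -> '4your-app' (dot at index 4)
// subauthor('your-app') -> 'your-app'  (no dots, unchanged)
```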

Managing an iFrame

Our IPFS server only has one file to serve. This file checks some signatures indirectly and puts the dApp in an iFrame. Let's see how this works.

The match function is the same as above. It does its own check to see if it's on an authorized subdomain before asking a Hive API for the post content. A user following a link to an unqualified domain will get a warning message, and no iFrame will be set up.

[image: warning shown on an unqualified domain]

Caddy Configuration

Finally, to serve anything out of our IPFS's subdomain gateway we've configured Caddy as follows.

*.ipfs.dlux.io {
        root * /var/www/html/ipfs
        file_server
        @ipfs {
                header Referer https://{labels.3}.ipfs.dlux.io*
        }
        handle @ipfs {
                reverse_proxy /ipfs/* localhost:8080
        }
        tls {
                dns cloudflare {api-key}
        }
}

*.ipfs.dlux.io handles our wildcard subdomain.
file_server serves our one and only file, which checks subdomains and sets up the iFrame sandbox.
@ipfs defines a rule where the referer matches the current subdomain.
handle @ipfs forwards ipfs/CIDs to the IPFS instance to load the dApp.
tls gives Caddy the information it needs to keep our SSL certs up to date.


Capture the Flag


I've put a 'secret' in my localStorage.

[image: screenshot of the 'secret' stored in localStorage]

I'll click on any link posted below. If you can get my secret from my localStorage, I'd love to know how. So much so, in fact, that I'll offer a 50,000 DLUX bounty (or 500 Hive).


If you can think of improvements, I want to know. If you have questions, ask them.

I hope that our sandbox is just as secure as any other website, that phishing from a non-managed URL is the best an attacker can do... and that the sandbox only executes code that the author wrote.

", "canonicalUrl": "https://peakd.com/hive/@disregardfiat/dlux-dapp-security-deep-dive"},{"url": "https://hive.blog/hive-112019/@spknetwork/should-you-lock-or-delegate-your-larynx-tokens", "probability": 0.9801415, "headline": "Should you LOCK or Delegate your LARYNX Tokens?", "datePublished": "2022-08-24T04:25:27.149596", "datePublishedRaw": "8 months ago", "author": "Hey Hivers", "authorsList": ["Hey Hivers"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23yd1vHmw5sMXkLwzxqXztwFanvs6b1cA8hTNuBpxqP73QCnz3rwGKfF3Ea67vgmuriga.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23yd1vHmw5sMXkLwzxqXztwFanvs6b1cA8hTNuBpxqP73QCnz3rwGKfF3Ea67vgmuriga.png", "https://images.hive.blog/DQmTECsHGr6KzyKw7bMyP6c2TZW25dk6XWSHDZicDrX5TvR/wallet.png"], "description": "Hey Hivers, Following a conversation with some good questions raised by @arcange, the below should help SPK Network supporters make the most of their LARYNX tokens to obtain\u2026 by spknetwork", "articleBody": "Following a conversation with some good questions raised by @arcange, the below should help SPK Network supporters make the most of their LARYNX tokens to obtain SPK Governance tokens. We hope this helps max out your SPK token mining as well as understand a little about the context of why things are being rolled out like they are:\n\nQ1) Hey, it's hard to find good documentation on how best to use/ invest your SPK-related tokens. For example, it is better to gov-lock all your LARYNX tokens as a node operator or keep a few liquid. Does it change something?\n\nFor now, all you can do is delegate LARYNX to node operators on the SPK chain. But as this builds out you will be able to delegate to other infrastructure operators such as CDN, encoder, storage, and validator node operators.\n\nThe central pins of the ecosystem are the validators, who will earn exceptional SPK rewards. 
All content and files flow through them. They will be elected with SPK via a top 20 vote (maybe more than 20). The more infrastructure they operate in an efficient way that suits the community, the more likely they are to be voted in as top validators.\n\nQ2) OK, but speaking of me as a node operator, should I lock all my LARYNX now?\nIs there any advantage to doing so?\n\nYes. As a node operator, it would make sense to lock all of your LARYNX into gov now if you wanted to max your SPK rewards.\n\nThe idea is that the more infra you operate, the more likely you will be elected as a top validator.\n\nQ3) What's the benefit of doing so?\n\nThis will max your SPK rewards. As @threespeak will not, for example, we also want to delegate to other operators who have genuinely worked on this ecosystem without much reward from the start.\n\nThis means we will not make as many SPK tokens, but more SPK will be spread further. As supporters of the ecosystem, it is also in our interest to make sure that SPK is spread far and wide, so we will both delegate to other node operators as well as stake. Both methods will earn SPK, and delegating will spread it further throughout the community.\n\nQ4) If I have an alt account with LARYNX, is it better to transfer them to me than lock or delegate?\n\nIf you want to max your SPK and are a node operator, send them to your node and lock them in. Later there will be a permanent lock-in option, which will give you a much better return. But you won't get your Larynx back with perma lock-ins. They will recycle to the SIP. By recycling Larynx to the SIP, it will encourage liquidity to flow into the SIP from people buying Larynx miner tokens to improve their mining efficiency.\n\nThe idea is that as you stake or delegate your Larynx, it will decay over time and end up back in the SIP where people can buy it from. 
This decay of the Larynx tokens back to the SIP rewards node operators, who are adding liquidity to the SIP by buying the latest Larynx miner tokens that come onto the market\n\nPeople are buying Larynx from the SIP lock in these payments permanently to the SIP to create liquidity and DeFi fees, For the community.\n\nAlternatively, we would advise keeping some dry powder Larynx and using it to delegate to other proven node operators who are operating more than just SPK Claim Chain Nodes. I think this will also be seen as favorable in the community. Are you here to earn SPK or here to spread it? A balance of both is good, IMO. Especially if you are a whale and want to ensure distribution is good. And at the same time, incentivize the most dedicated infra operators.\n\nSoon, you will be able to see who is operating which node types, as well as just a chain node. (Chain nodes are not resource intensive as they are great layer two systems where almost everything is on hive layer one - so Speak Claim Chain nodes are super lightweight).\n\nIntensive resources will be when operating encoder nodes, storage, validator, and content deliver nodes. That's where we need to ensure SPK is distributed to encoder nodes operating. Anyone can run one. But they aren't tied into SPK rewards yet.\n\nWe are currently working on CDN and storage and validator nodes. Should have a working system out in 3-4 Months with all node types. The idea is the more of these you run, the more SPK votes you will get. This means you are more likely to be a top 20 validator. That's where some decent rewards are. But ATM, the inflation of SPK is set super low so that not much is distributed.\n\nThe idea is to keep SPK inflation super low for now so that there is an initial distribution. 
Then when all the infrastructure is out and operating, the inflation of SPK will be ramped up to a Bitcoin-type inflation curve until it reaches its cap in a few decades from now.\n\nThis will mean that the early adopters of SPK and those who have worked hard to run nodes will be the first to vote on the top validators before the inflation starts properly in a few months.\n\nQ5) All of these things are new to us. I suppose it will take some time before everyone understands it and can make good decisions.\n\nWe want a circulation of SPK there to allow the early adopters to vote for their validators once these node types are out. Once anyone can run all infrastructure types, at that point, the initial validators will be elected into position with DPoS SPK token community votes. Then we plan to run a fork to ramp up the inflation of SPK to normal levels.\n\nIt's difficult to do as it's all is in flux, and this early, low SPK inflation phase is temporary. But the path forward is clear.\n\nAbout the SPK Network:", "articleBodyHtml": "
\n\n

Following a conversation with some good questions raised by @arcange, the below should help SPK Network supporters make the most of their LARYNX tokens to obtain SPK Governance tokens. We hope this helps max out your SPK token mining as well as understand a little about the context of why things are being rolled out like they are:

\n\n

Q1) Hey, it's hard to find good documentation on how best to use/invest your SPK-related tokens. For example, is it better to gov-lock all your LARYNX tokens as a node operator or to keep a few liquid? Does it change anything?

\n\n

For now, all you can do is delegate LARYNX to node operators on the SPK chain. But as this builds out you will be able to delegate to other infrastructure operators such as CDN, encoder, storage, and validator node operators.

\n\n

The central pins of the ecosystem are the validators, who will earn exceptional SPK rewards. All content and files flow through them. They will be elected with SPK via a top 20 vote (maybe more than 20). The more infrastructure they operate in an efficient way that suits the community, the more likely they are to be voted in as top validators.
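The election described above (top 20 by SPK vote, possibly more) can be sketched as a simple rank-and-slice. This is an illustration of the mechanism only, not the network's actual implementation; the function name and vote format are made up.

```javascript
// Illustrative sketch of a top-N DPoS-style election: rank candidate
// validators by total SPK voted for them and keep the top N (20 here,
// though the post notes it may end up being more than 20).
function electValidators(voteTotals, n = 20) {
  return Object.entries(voteTotals)
    .sort((a, b) => b[1] - a[1])   // highest SPK total first
    .slice(0, n)                   // keep the top N
    .map(([account]) => account);  // return just the account names
}
```

For example, `electValidators({ 'node-a': 500, 'node-b': 900, 'node-c': 100 }, 2)` returns `['node-b', 'node-a']`.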

\n\n

Q2) OK, but speaking of me as a node operator, should I lock all my LARYNX now?
\nIs there any advantage to doing so?

\n\n

Yes. As a node operator, it would make sense to lock all of your LARYNX into gov now if you wanted to max your SPK rewards.

\n\n

The idea is that the more infra you operate, the more likely you will be elected as a top validator.

\n\n

Q3) What's the benefit of doing so?

\n\n

This will max your SPK rewards. That said, @threespeak, for example, will not lock everything: we also want to delegate to other operators who have genuinely worked on this ecosystem without much reward from the start.

\n\n

This means we will not make as many SPK tokens, but more SPK will be spread further. As supporters of the ecosystem, it is also in our interest to make sure that SPK is spread far and wide, so we will both delegate to other node operators as well as stake. Both methods will earn SPK, and delegating will spread it further throughout the community.

\n\n

Q4) If I have an alt account with LARYNX, is it better to transfer them to my main account than to lock or delegate?

\n\n

If you want to max your SPK and are a node operator, send them to your node and lock them in. Later there will be a permanent lock-in option, which will give you a much better return. But you won't get your Larynx back with perma lock-ins. They will recycle to the SIP. By recycling Larynx to the SIP, it will encourage liquidity to flow into the SIP from people buying Larynx miner tokens to improve their mining efficiency.

\n\n

The idea is that as you stake or delegate your Larynx, it will decay over time and end up back in the SIP, where people can buy it. This decay of Larynx tokens back to the SIP rewards node operators who add liquidity to the SIP by buying the latest Larynx miner tokens that come onto the market.

\n\n

People buying Larynx from the SIP lock these payments permanently into the SIP, creating liquidity and DeFi fees for the community.
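The decay mechanic above can be pictured as a per-period percentage of the staked balance recycling to the SIP. The actual rate and schedule are not given in the post, so the numbers below are purely illustrative.

```javascript
// Purely illustrative: the post does not specify the real decay rate.
// Staked/delegated Larynx decays by some fraction each period; the
// decayed portion ends up back in the SIP for others to buy.
function decayedBalance(staked, ratePerPeriod, periods) {
  return staked * Math.pow(1 - ratePerPeriod, periods);
}

function recycledToSip(staked, ratePerPeriod, periods) {
  return staked - decayedBalance(staked, ratePerPeriod, periods);
}
```

With a hypothetical 10% per-period rate, 1000 staked Larynx leaves about 900 after one period, with about 100 recycled to the SIP.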

\n\n

Alternatively, we would advise keeping some dry powder Larynx and using it to delegate to other proven node operators who are operating more than just SPK Claim Chain Nodes. I think this will also be seen as favorable in the community. Are you here to earn SPK or here to spread it? A balance of both is good, IMO. Especially if you are a whale and want to ensure distribution is good. And at the same time, incentivize the most dedicated infra operators.

\n\n

Soon, you will be able to see who is operating which node types, rather than just whether they run a chain node. (Chain nodes are not resource intensive: they are layer-two systems where almost everything is on Hive layer one, so SPK Claim Chain nodes are super lightweight.)

\n\n

The resource-intensive work is in operating encoder, storage, validator, and content delivery nodes. That's where we need to ensure SPK is distributed to the operators. Anyone can run an encoder node, but they aren't tied into SPK rewards yet.

\n\n

We are currently working on CDN, storage, and validator nodes, and should have a working system with all node types out in 3-4 months. The idea is that the more of these you run, the more SPK votes you will get, which makes you more likely to be a top 20 validator. That's where some decent rewards are. But at the moment, the inflation of SPK is set super low so that not much is distributed.

\n\n

The idea is to keep SPK inflation super low for now so that there is an initial distribution. Then when all the infrastructure is out and operating, the inflation of SPK will be ramped up to a Bitcoin-type inflation curve until it reaches its cap in a few decades from now.
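A Bitcoin-type curve means a fixed per-period reward that halves on a schedule, so total emission approaches a hard cap. A minimal sketch with made-up parameters (SPK's real schedule is not published in the post):

```javascript
// Made-up parameters; SPK's actual schedule is not given in the post.
// The per-period reward halves every `halvingEvery` periods, so total
// emission asymptotically approaches a hard cap.
function emissionAt(period, initialReward = 1000, halvingEvery = 52) {
  const halvings = Math.floor(period / halvingEvery);
  return initialReward / Math.pow(2, halvings);
}

function totalEmitted(periods, initialReward = 1000, halvingEvery = 52) {
  let total = 0;
  for (let p = 0; p < periods; p++) {
    total += emissionAt(p, initialReward, halvingEvery);
  }
  return total; // bounded above by initialReward * halvingEvery * 2
}
```

The geometric halving is what produces the "cap in a few decades" behavior: each halving era emits half as much as the previous one.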

\n\n

This will mean that the early adopters of SPK and those who have worked hard to run nodes will be the first to vote on the top validators before the inflation starts properly in a few months.

\n\n

Q5) All of these things are new to us. I suppose it will take some time before everyone understands it and can make good decisions.

\n\n

We want SPK in circulation so that early adopters can vote for their validators once these node types are out. Once anyone can run all infrastructure types, the initial validators will be elected into position with DPoS SPK token community votes. Then we plan to run a fork to ramp up the inflation of SPK to normal levels.

\n\n

It's difficult to do as it's all in flux, and this early, low-inflation SPK phase is temporary. But the path forward is clear.

\n\n

About the SPK Network:

\n\n
", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/should-you-lock-or-delegate-your-larynx-tokens"},{"url": "https://hive.blog/larynx/@dalz/a-look-at-spk-miner-token-larynx-or-data-on-accounts-claiming-locked-tokens-and-top-holders", "probability": 0.97114277, "headline": "A Look At SPK Miner Token LARYNX | Data on accounts claiming, locked tokens and top holders", "datePublished": "2022-11-24T04:25:27.696420", "datePublishedRaw": "5 months ago", "author": "dalz", "authorsList": ["dalz"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/dalz/23u5ZhGGYFJYbfNW5xDw9hiFtHKpbubym3hRYxrfixxvmGwJLnKiDFYDfk2L9yeZEr3Py.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23u5ZhGGYFJYbfNW5xDw9hiFtHKpbubym3hRYxrfixxvmGwJLnKiDFYDfk2L9yeZEr3Py.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23tSKXon6bazSikidthrYgAyvBv3c97RxTu74ryeZBgVTwobHyeoKs4PgjTjKYgx1H2KD.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJoKXcaSz2CGFrELeWgkhgg6NF6b2p45Rzy1BVcRrLKVBwJjhBD2Ca5xaC5afEG.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23tRtVsQUnNsHfXCHv4SE7omwooYvBHpZa3QYB3dmdaLhVGQ4AxgRWGDoWNYxvWwNf3s7.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23u5tdnt8gB94BSkRXXmUC57CcCoSakRkxNWRQC91SFoRPaeBCJdeSXxMYCYfEkTTWvqv.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23tRtD3bvwGcxoHHYw55SbBf42je43UJSfFKSS4kpP5tGXgHHV25JYMnNmva8iD2JN4BG.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/dalz/23wgFfek9MzS9Yn7rMdFY3VeowYgSx9RPugCzgXD2CneYLSKWHwQm1kBoKiSvynwyQStC.png", "https://images.hive.blog/768x0/https://www.azoreswhalewatch.com/wp-content/uploads/a17.jpg"], "description": "The LARYNX tokens have been available for Hivers to claim it since March 2022. 
It is distributed to Hive holders via 12 months airdrop. Each month users can claim it. Starting\u2026 by dalz", "articleBody": "The LARYNX tokens have been available for Hivers to claim it since March 2022. It is distributed to Hive holders via 12 months airdrop. Each month users can claim it.\n\nStarting from August a small amount of SPK tokens started to be distributed to account that powered it up or locked as node operators. This has increased the overall participation in the network from users.\n\nThe more techy users can run a node for the claim node where the data is stored for the token. Node runners need locked LARYNX to be in the top validators for the transactions and earn fees in for of the LARYNX token.\nThe ones who don\u2019t run nodes can delegate their tokens to node operators and start earning small amount of the main governance token SPK. The way it works is that both, the delegators and node operators earn a small amount of SPK.\nA note that the SPK rewards at this stage are small. Don\u2019t except to earn a lot when delegating the LARYNX tokens. I\u2019m personally running a claim node and you can delegate LARYNX to me or some of the other node operators on the link https://vue.dlux.io/me#wallet/.\n\nWith this said let\u2019s take a look at some LARYNX data.\nWe will be looking at:\n\nNumber of accounts claiming LARYNX per day\nNumber of accounts claiming LARYNX per month\nDaily locked LARYNX tokens\nCumulative locked LARYNX\nTop accounts that locked LARYNX\n\nThe period that we will be looking at is from March to November 2022.\n\nNumber Of Accounts Claiming LARYNX Per Day\n\nHere is the chart.\n\nWe can notice that the usually at the start of the month there is a spike in the number of accounts claiming. 
This is due to the fact that users can claim once per month, and as we can see a lot of them are doing it at the first of the month.\nWhen the claiming become available first in March 2022, there was more than 1200 accounts that claimed on the first day and approximately the same amount on the second day. Since then we can see a downtrend for the first of each month up until August. Since August there is an uptrend in the numbers of accounts claiming.\n\nOn a monthly basis the chart looks like this.\n\nThis chart shows a clearer picture. A record high in March, then a drop until August and a growth in the last two months.\nOctober has seen a record high number in accounts claiming, with almost 4.5k accounts claiming the token.\n\nA total of 9.1k accounts have claimed LARYNX token.\n\nI was not able to get data of the claimed tokens per day/month, but until today there is a total of 52M claimed tokens for a period of eight months. This is an average of 6.8M LARYNX tokens claimed monthly, and if we extrapolate for one year, that would be around 80M projection for total claimed LARYNX.\nIf we take in consideration that the available supply of HIVE was around 370M at the moment of snapshot in January 2022, this will be around 22% claimed tokens share. Its quite a low percentage that will leave a lot of unclaimed tokens that the community will later need to decided how to distribute or burn some of them.\n\nDaily Locked LARYNX Tokens For Governance\n\nHere is the chart for daily locked LARYNX.\n\nThe chart includes both powered up and locked LARYNX. The accounts that are running a node can lock the token (white) and the accounts that are not running nodes are powering up (orange) and then delegating to node operators.\n\nWe can see that there is a huge spike in the amount of tokens powered up in August when the option became available. 
The amount of powered up tokens is now surpassing the amount of locked tokens by node operators, as there is more users who are powering up and delegating.\n\nHere is the chart.\n\nMore than 18M LARYNX is powered/locked for governance.\nThis represents a 35% share from the total of 51M claimed tokens.\nWe can clearly see the uptrend starting from August when the option to power up was introduced and small rewards in SPK as incentives.\n\nTop Accounts That Powered/Locked LARYNX\n\nHere are the top accounts that powered/locked LARYNX.\n\nQuite nice distribution :).\nThe @thecallmedan dedicated account for LARYNX @tcmd-spkcc is on the top with more than 1M tokens followed very closely by @blocktrades. Next is @encrypt3dbr0k3r who is also above 1M, but is not running a node and is delegating to node operators.\n\nAs we can see from the above the number of accounts claiming LARYNX has been significant with 3k to 4k accounts claiming monthly and cumulative 9k accounts. The more owners the token has the better the distribution and the overall tokenomics. On the other hand, the amount of tokens claimed from the available is relatively low with 22% share atm. This means that there are whale accounts that are not into LARYNX and not claiming the token. The exchange accounts are expected to leave behind some tokens, but there are other large accounts that are not claiming.\n\nStarting from August the amount of powered up LARYNX has increased significantly with the SPK rewards going live. The top holders have around 1M tokens powered/locked. The overall distribution seem quite nice at the moment and its reaching almost 10k accounts.\nSmall first steps for the SPK ecosystem but slowly it is taking shape. The task that system is trying to solve is one of the biggest challenges for decentralized online content.\n\n@dalz", "articleBodyHtml": "
\n\n

The LARYNX tokens have been available for Hivers to claim since March 2022. They are distributed to Hive holders via a 12-month airdrop, and each month users can claim.

\n\n

Starting from August, a small amount of SPK tokens started to be distributed to accounts that powered up or locked tokens as node operators. This has increased overall participation in the network from users.

\n\n
\"01.png\"
\n\n

More technical users can run a claim node, where the data for the token is stored. Node runners need locked LARYNX to be among the top validators for transactions and earn fees in the form of LARYNX.
\nThose who don't run nodes can delegate their tokens to node operators and start earning a small amount of the main governance token, SPK. The way it works is that both the delegators and the node operators earn a small amount of SPK.
\nA note that the SPK rewards at this stage are small. Don't expect to earn a lot when delegating LARYNX tokens. I'm personally running a claim node, and you can delegate LARYNX to me or one of the other node operators at https://vue.dlux.io/me#wallet/.
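The shared reward can be sketched as splitting the delegation interest between delegator and node operator. The 50/50 split mirrors the rates the SPK team proposed elsewhere (0.03% APR on delegations, 0.015% to each side); treat the function name and figures as illustrative.

```javascript
// Illustrative: mirrors the proposed 0.03% APR on delegated LARYNX,
// split evenly between the delegator and the node operator.
function delegationRewards(delegated, aprPercent, days) {
  const interest = delegated * (aprPercent / 100) * (days / 365);
  return { delegator: interest / 2, operator: interest / 2 };
}
```

At these proposed rates, delegating 1,000,000 LARYNX for a full year yields on the order of 150 SPK to each side, which is why the post stresses that rewards are small at this stage.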

\n\n

With this said let\u2019s take a look at some LARYNX data.
\nWe will be looking at:

\n\n
  • Number of accounts claiming LARYNX per day
  • Number of accounts claiming LARYNX per month
  • Daily locked LARYNX tokens
  • Cumulative locked LARYNX
  • Top accounts that locked LARYNX
\n\n

The period that we will be looking at is from March to November 2022.

\n\n

Number Of Accounts Claiming LARYNX Per Day

\n\n

Here is the chart.

\n\n
\"image001.png\"
\n\n

We can see that there is usually a spike in the number of accounts claiming at the start of the month. This is because users can claim once per month, and as the chart shows, many do so on the first of the month.
\nWhen claiming first became available in March 2022, more than 1,200 accounts claimed on the first day, and approximately the same number claimed on the second day. Since then there was a downtrend for the first of each month up until August; since August, the number of accounts claiming has been trending up.

\n\n

On a monthly basis the chart looks like this.

\n\n
\"image003.png\"
\n\n

This chart shows a clearer picture: a record high in March, then a drop until August, and growth in the last two months.
\nOctober saw a record high number of accounts claiming, with almost 4.5k accounts claiming the token.

\n\n

A total of 9.1k accounts have claimed the LARYNX token.

\n\n

I was not able to get data on the claimed tokens per day/month, but to date there is a total of 52M tokens claimed over a period of eight months. That is an average of about 6.5M LARYNX claimed monthly, and if we extrapolate to one year, that projects to roughly 80M total claimed LARYNX.
\nIf we take into consideration that the available supply of HIVE was around 370M at the moment of the snapshot in January 2022, this comes to around a 22% claimed share. It's quite a low percentage, which will leave a lot of unclaimed tokens that the community will later need to decide whether to distribute or burn.
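A quick back-of-the-envelope check of the extrapolation, using the round figures quoted (52M claimed over eight months, ~370M HIVE available at the snapshot):

```javascript
// Round figures from the post; the result is approximate by design.
const claimed = 52_000_000;                 // LARYNX claimed in eight months
const months = 8;
const monthlyAvg = claimed / months;        // 6.5M per month
const yearProjection = monthlyAvg * 12;     // 78M, roughly the ~80M quoted
const hiveSupply = 370_000_000;             // HIVE supply at the snapshot
const claimedShare = yearProjection / hiveSupply; // ~0.21, near the ~22% quoted
```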

\n\n

Daily Locked LARYNX Tokens For Governance

\n\n

Here is the chart for daily locked LARYNX.

\n\n
\"image005.png\"
\n\n

The chart includes both powered up and locked LARYNX. The accounts that are running a node can lock the token (white) and the accounts that are not running nodes are powering up (orange) and then delegating to node operators.

\n\n

We can see that there is a huge spike in the amount of tokens powered up in August, when the option became available. The amount of powered-up tokens now surpasses the amount locked by node operators, as more users are powering up and delegating.

\n\n

Here is the chart.

\n\n
\"image007.png\"
\n\n

More than 18M LARYNX is powered/locked for governance.
\nThis represents a 35% share of the total of 51M claimed tokens.
\nWe can clearly see the uptrend starting from August, when the option to power up was introduced along with small SPK rewards as incentives.
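The 35% figure checks out against the post's round numbers:

```javascript
// 18M powered/locked out of 51M claimed LARYNX ≈ 35%
const lockedLarynx = 18_000_000;
const claimedLarynx = 51_000_000;
const governanceShare = lockedLarynx / claimedLarynx; // ≈ 0.35
```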

\n\n

Top Accounts That Powered/Locked LARYNX

\n\n

Here are the top accounts that powered/locked LARYNX.

\n\n
\"image009.png\"
\n\n

Quite nice distribution :).
\n@thecallmedan's dedicated LARYNX account @tcmd-spkcc is at the top with more than 1M tokens, followed very closely by @blocktrades. Next is @encrypt3dbr0k3r, who is also above 1M but is not running a node and is delegating to node operators.

\n\n

As we can see from the above, the number of accounts claiming LARYNX has been significant, with 3k to 4k accounts claiming monthly and 9k accounts cumulatively. The more owners the token has, the better the distribution and the overall tokenomics. On the other hand, the share of available tokens actually claimed is relatively low at 22% atm. This means there are whale accounts that are not into LARYNX and not claiming the token. Exchange accounts are expected to leave some tokens behind, but there are other large accounts that are not claiming.

\n\n

Starting from August, the amount of powered-up LARYNX has increased significantly with the SPK rewards going live. The top holders have around 1M tokens powered/locked. The overall distribution seems quite nice at the moment, and it's reaching almost 10k accounts.
\nSmall first steps for the SPK ecosystem, but slowly it is taking shape. The task the system is trying to solve is one of the biggest challenges for decentralized online content.

\n\n


\n@dalz

\n\n
", "canonicalUrl": "https://peakd.com/larynx/@dalz/a-look-at-spk-miner-token-larynx-or-data-on-accounts-claiming-locked-tokens-and-top-holders"},{"url": "https://hive.blog/hive-112019/@spknetwork/rvsycrqq", "probability": 0.75441635, "headline": "SPK Network AMA Recording", "datePublished": "2022-05-24T04:25:27.827731", "datePublishedRaw": "11 months ago", "author": "spknetwork", "authorsList": ["spknetwork"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeifpxt7ndakxng2ivkkljj5wpidowfp7uw4nawtt2fks72c62c2zgy", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23vhz1kKXHHEKh1U8sjgzp2avQh3m9bF4Zk2RTh1YqSWPv1YFEido8K62eddxc2KMt1Xm.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://3speak.tv/embed?v=spknetwork/rvsycrqq"], "description": "\u25b6\ufe0f Watch on 3Speak Hello Hivers, Today we had a great AMA, this is the recording. 
Thank you to @disregardfiat for his input and also to @starkerz and\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": "", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/rvsycrqq"},{"url": "https://hive.blog/hive-112019/@spknetwork/wexlenxm", "probability": 0.67676306, "headline": "How Validators and Storage Nodes will be Rewarded - Asking for Dev Feedback", "datePublished": "2022-06-24T04:25:29.356780", "datePublishedRaw": "10 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeidb7w2pkr7gktgmdi53wtucqtsvhmat3zjxiohhpfh5mhqkbla7ce", "images": ["https://3speak.tv/embed?v=spknetwork/wexlenxm", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak Join @starkerz and @disregardfiat in this conversation about validators and storage nodes. Storage Contracts: Example:\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nJoin @starkerz and @disregardfiat in this conversation about validators and storage nodes.\n\nStorage Contracts:\n\n\u201cQmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs\u201d: { Broca: 1000, //remaining? Can get pretty fuzzy here depending on how Broca is rewarded Bytes: 1024000, Expires: 70000000, BrocaPerCheck: 50, Author: \u201cauthor-a\u201d Validator: \u201caccount-v\u201d, Storage-A: \u201caccount-a\u201d, Storage-B: \u201caccount-b\u201d, Storage-C: \u201caccount-c\u201d, }\n\nValidator Example:\n\n\u201caccount-v\u201d: { domain: \"api.val.io\", self: from, strikes: 0, //what bad actions? 
branded: 100000, //larynx burned into account Delegated: 100000, Pingaverage: 500, delegators: { \u201cdelegator-a\u201d: 100000 }, contracts: { \u201cQmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs\u201d: 70000000 }, //not sure if needed here PubKey: \u201cSTM6EUEaEywYoxpeVDX1fPDxrsyQLGTsgYf1LLDSHWwiKBdgRhGrx\u201d, }\n\nStorage Example:\n\n\u201caccount-a\u201d: { domain: \"api.storage.io\", strikes: 0, //what bad actions? branded: 100000, //larynx burned into account Bytes_total: 1024000000 Delegated: 100000, delegators: { \u201cdelegator-a\u201d: 100000 }, contracts: { \u201cQmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs\u201d: 1024000 }, PubKey: \u201cSTM6EUEaEywYoxpeVDX1fPDxrsyQLGTsgYf1LLDSHWwiKBdgRhGrx\u201d, }\n\nContract/functions:\n\nAdd Validator:\n\nBuilds \u201caccount-v\u201d with larynx. Add Brand: Burns more larynx into an account, to mine for SPK\n\nAdd Storage Node:\n\nBundle: { //from validator node with new files to store \u201cQmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs\u201d: 1024000 \u201cAuthor\u201d: \u201cauthor-a\u201d } Report: { Checks: { \u201cQmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs\u201d:{ \u201cStorage-a\u201d: 750, \u201cStorage-b\u201d: 500, \u201cStorage-c\u201d: OFFLINE, }, \u2026 }, OldSecret: mysecret. NewSecrets:{ A: \u201c#mysecret\u201d B: \u201c#mysecret\u201d }\n\nVote for our Witness:", "articleBodyHtml": "
\n\n
\n\n

\u25b6\ufe0f Watch on 3Speak

\n\n


\n\"spkvalidatstonmode.png\"

\n\n

Join @starkerz and @disregardfiat in this conversation about validators and storage nodes.

\n\n

Storage Contracts:

\n\n
'QmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs': {\n  Broca: 1000, // remaining? Can get pretty fuzzy here depending on how Broca is rewarded\n  Bytes: 1024000,\n  Expires: 70000000,\n  BrocaPerCheck: 50,\n  Author: 'author-a',\n  Validator: 'account-v',\n  'Storage-A': 'account-a',\n  'Storage-B': 'account-b',\n  'Storage-C': 'account-c',\n}\n
\n\n

Validator Example:

\n\n
'account-v': {\n  domain: 'api.val.io',\n  self: from,\n  strikes: 0, // what bad actions?\n  branded: 100000, // larynx burned into account\n  Delegated: 100000,\n  Pingaverage: 500,\n  delegators: {\n    'delegator-a': 100000\n  },\n  contracts: {\n    'QmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs': 70000000\n  }, // not sure if needed here\n  PubKey: 'STM6EUEaEywYoxpeVDX1fPDxrsyQLGTsgYf1LLDSHWwiKBdgRhGrx',\n}\n
\n\n

Storage Example:

\n\n
'account-a': {\n  domain: 'api.storage.io',\n  strikes: 0, // what bad actions?\n  branded: 100000, // larynx burned into account\n  Bytes_total: 1024000000,\n  Delegated: 100000,\n  delegators: {\n    'delegator-a': 100000\n  },\n  contracts: {\n    'QmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs': 1024000\n  },\n  PubKey: 'STM6EUEaEywYoxpeVDX1fPDxrsyQLGTsgYf1LLDSHWwiKBdgRhGrx',\n}\n
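One way to read the storage example above: Bytes_total is the node's advertised capacity, and the contracts map tracks bytes committed per contract CID. A hypothetical helper (my reading of the fields, not code from the post) would compute remaining capacity like this:

```javascript
// Hypothetical helper based on the field names in the example above:
// remaining capacity = advertised Bytes_total minus the bytes already
// committed across all of the node's storage contracts.
function freeBytes(storageNode) {
  const used = Object.values(storageNode.contracts)
    .reduce((sum, bytes) => sum + bytes, 0);
  return storageNode.Bytes_total - used;
}
```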
\n\n

Contract/functions:

\n\n

Add Validator:

\n\n

Builds 'account-v' with Larynx. Add Brand: burns more Larynx into an account to mine for SPK.

\n\n

Add Storage Node:

\n\n
Bundle: { // from validator node with new files to store\n  'QmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs': 1024000,\n  Author: 'author-a'\n}\n\nReport: {\n  Checks: {\n    'QmXedNPCLxXxMrSkBjYgMaKy1uCmixiJ2LGdruLfeCsRgs': {\n      'Storage-a': 750,\n      'Storage-b': 500,\n      'Storage-c': OFFLINE,\n    },\n    \u2026\n  },\n  OldSecret: 'mysecret',\n  NewSecrets: {\n    A: '#mysecret',\n    B: '#mysecret'\n  }\n}\n
\n\n

Vote for our Witness:

\n\n
", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/wexlenxm"},{"url": "https://hive.blog/hive-112019/@spknetwork/usvpezvf", "probability": 0.870293, "headline": "Earn SPK by Delegating LARYNX | Share your Feedback and Review the Code", "datePublished": "2022-06-24T04:25:33.367415", "datePublishedRaw": "10 months ago", "author": "spknetwork", "authorsList": ["spknetwork"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeibxzvt3crw4q4n5hthoxojahnla36sminfgald36kppawtyzzdwe4", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uQpCgjvBAX4ouvFxeGGUhhVYkm8NmaZxAVKcyZvBs5N4QCTrA8sbDbVy5U8JADLvwyv.png", "https://3speak.tv/embed?v=spknetwork/usvpezvf", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak We've been humming along for a few months now, it's been mostly smooth but of course there have been a couple of hiccoughs. There have been\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nWe've been humming along for a few months now, it's been mostly smooth but of course there have been a couple of hiccoughs. There have been more than 50 accounts running node services which provides decentralization as well as capital to collateralize the ad-hoc multi-signature decentralized exchange. The goal of the SPK network is to facilitate decentralized storage which means we have a few stops on the way.\n\nProposing software version 1.1.0\n\nHere we are, for the first time, issuing some SPK tokens. The idea is to trickle out tokens to our users for governance voting so when the time comes to scale, it's not us making all the decisions. 

\u25b6\ufe0f Watch on 3Speak


We've been humming along for a few months now. It's been mostly smooth, but of course there have been a couple of hiccups. More than 50 accounts are running node services, which provides decentralization as well as the capital to collateralize the ad-hoc, multi-signature decentralized exchange. The goal of the SPK Network is to facilitate decentralized storage, which means we have a few stops along the way.


Proposing software version 1.1.0


Here we are, for the first time, issuing some SPK tokens. The idea is to trickle out tokens to our users for governance voting, so that when the time comes to scale, it's not us making all the decisions. Initially we're setting our APR at a very low number, so the first-mover advantage will be more in decision making and less in token accumulation.


Proposed Rates:


0.1% APR to node operators
0.03% APR to delegators to node operators (split between the delegator and the delegatee, 0.015% each)
0.01% APR to non-delegators


This rate is calculated daily, is non-compounding, and is based on Locked or Powered LARYNX only.


The incentive is for our users to provide infrastructure, or to select their infrastructure providers. The low rate for delegation makes it more profitable to provide services than to game the delegation system. The lowest rate goes to those interested enough to stake into the ecosystem, but not much else.


Interest


Those familiar with the HBD interest algorithms will find some nostalgic methods here. When an account sends SPK, powers up or delegates LARYNX, or claims earnings, its interest is calculated first. The periods are based on whole days (28,800 blocks). This keeps compute cycles low and makes scaling easier. The downside is that front ends will have to calculate balances themselves.
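To make the period math concrete, here is a minimal sketch (the helper names are mine, not from the codebase) of the whole-day accounting, assuming Hive's 28,800 blocks per day:

```javascript
// Hypothetical helpers illustrating the whole-day period math.
const BLOCKS_PER_DAY = 28800; // one Hive block every 3 seconds

// Whole days elapsed since the last interest checkpoint.
const wholeDays = (headBlock, checkpointBlock) =>
  Math.floor((headBlock - checkpointBlock) / BLOCKS_PER_DAY);

// After paying interest, the checkpoint advances by whole days only,
// so the partial-day remainder keeps accruing toward the next period.
const nextCheckpoint = (headBlock, checkpointBlock) =>
  headBlock - ((headBlock - checkpointBlock) % BLOCKS_PER_DAY);
```

For example, 60,000 blocks since the last checkpoint counts as two whole days, and the remaining 2,400 blocks carry over toward day three.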


Code Review


I welcome and strongly encourage code review. I'll explain some of the biggest pieces below.


Interest Calc

```javascript
const simpleInterest = (p, t, r) => {
  const amount = p * (1 + r / 365);
  const interest = amount - p;
  return parseInt(interest * t);
};
```

p => principal
t => time in days
r => rate (0.01, 0.0015, 0.001)
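To see what this produces, here is the same function with a worked call. I'm assuming balances are stored in milli-token units, matching the `/ 1000` display conversions used elsewhere in this code:

```javascript
const simpleInterest = (p, t, r) => {
  const amount = p * (1 + r / 365); // one day of interest at APR r
  const interest = amount - p;      // daily interest, still fractional
  return parseInt(interest * t);    // t days, truncated to an integer
};

// 10,000.000 tokens (10,000,000 milli-units) at 0.1% APR for 3 whole days:
simpleInterest(10000000, 3, 0.001); // → 82 milli-units, i.e. 0.082 tokens
```

Because parseInt truncates, very small stakes earn 0 over short periods; the same three days on 100.000 tokens truncates to zero.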


SPK Earnings Calc

```javascript
const reward_spk = (acc, bn) => {
  return new Promise((res, rej) => {
    const Pblock = getPathNum(["spkb", acc]);
    const Pstats = getPathObj(["stats"]);
    const Ppow = getPathNum(["pow", acc]);
    const Pgranted = getPathNum(["granted", acc, "t"]);
    const Pgranting = getPathNum(["granting", acc, "t"]);
    const Pgov = getPathNum(["gov", acc]);
    const Pspk = getPathNum(["spk", acc]);
    const Pspkt = getPathNum(["spk", "t"]);
    Promise.all([Pblock, Pstats, Ppow, Pgranted, Pgranting, Pgov, Pspk, Pspkt]).then(
      (mem) => {
        var block = mem[0],
          diff = bn - block,
          stats = mem[1],
          pow = mem[2],
          granted = mem[3],
          granting = mem[4],
          gov = mem[5],
          spk = mem[6],
          spkt = mem[7],
          r = 0, a = 0, b = 0, c = 0, t = 0;
        if (!block) {
          store.batch(
            [{ type: "put", path: ["spkb", acc], data: bn }],
            [res, rej, 0]
          );
        } else if (diff < 28800) { // min claim period
          res(r);
        } else {
          t = parseInt(diff / 28800);
          a = simpleInterest(gov, t, stats.spk_rate_lgov);
          b = simpleInterest(pow, t, stats.spk_rate_lpow);
          c = simpleInterest(granted + granting, t, stats.spk_rate_ldel);
          const i = a + b + c;
          if (i) {
            store.batch(
              [
                { type: "put", path: ["spk", acc], data: spk + i },
                { type: "put", path: ["spk", "t"], data: spkt + i },
                { type: "put", path: ["spkb", acc], data: bn - (diff % 28800) }
              ],
              [res, rej, i]
            );
          } else {
            res(0);
          }
        }
      }
    );
  });
};
```

Here the different balances are read from memory, interest is calculated, and both the account's SPK balance and the total SPK balance are adjusted. Interest is calculated for whole days, with the last-claim checkpoint stored as a block number in ['spkb'].


SPK Send

```javascript
exports.spk_send = (json, from, active, pc) => {
  let Pinterest = reward_spk(from, json.block_num),
    Pinterest2 = reward_spk(json.to, json.block_num);
  Promise.all([Pinterest, Pinterest2])
    .then((interest) => {
      let fbalp = getPathNum(["spk", from]),
        tbp = getPathNum(["spk", json.to]); // to-balance promise
      Promise.all([fbalp, tbp])
        .then((bals) => {
          let fbal = bals[0],
            tbal = bals[1],
            ops = [],
            send = parseInt(json.amount); // declared here to avoid an implicit global
          if (
            json.to &&
            typeof json.to == "string" &&
            send > 0 &&
            fbal >= send &&
            active &&
            json.to != from
          ) {
            // balance checks passed
            ops.push({
              type: "put",
              path: ["spk", from],
              data: parseInt(fbal - send),
            });
            ops.push({
              type: "put",
              path: ["spk", json.to],
              data: parseInt(tbal + send),
            });
            let msg = `@${from}| Sent @${json.to} ${parseFloat(
              parseInt(json.amount) / 1000
            ).toFixed(3)} SPK`;
            if (config.hookurl || config.status)
              postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
            ops.push({
              type: "put",
              path: ["feed", `${json.block_num}:${json.transaction_id}`],
              data: msg,
            });
          } else {
            ops.push({
              type: "put",
              path: ["feed", `${json.block_num}:${json.transaction_id}`],
              data: `@${from}| Invalid spk send operation`,
            });
          }
          if (process.env.npm_lifecycle_event == "test") pc[2] = ops;
          store.batch(ops, pc);
        })
        .catch((e) => {
          console.log(e);
        });
    });
};
```

Here you can see that interest is calculated and credited before the send operation occurs. In the future this will also happen for all smart contracts that rely on an SPK balance or change locked LARYNX balances.
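The ordering is the important part: settle interest first, then move balances. Here is a toy model of that settle-before-mutate pattern, using a plain in-memory object as a stand-in for the real state store (all names here are illustrative, not the actual API):

```javascript
// In-memory stand-in for the chain state (hypothetical, for illustration only).
const state = { spk: { alice: 1000, bob: 500 } };

// Stand-in for reward_spk: credit any pending interest before balances change.
const settleInterest = (acc, pending) => {
  state.spk[acc] += pending || 0;
};

const send = (from, to, amount, pending = {}) => {
  // 1) settle both parties first, so stale balances never move
  settleInterest(from, pending[from]);
  settleInterest(to, pending[to]);
  // 2) then apply the transfer against the fresh balances
  if (amount > 0 && state.spk[from] >= amount && from !== to) {
    state.spk[from] -= amount;
    state.spk[to] += amount;
    return true;
  }
  return false;
};
```

With 50 milli-units of pending interest, alice can send 1,050 even though her stored balance was only 1,000; skipping step 1 would have rejected the transfer.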


One concern here: if the approach to voting is space sensitive, changing SPK balances will require vote weights to be returned to the average. If votes are stored in the system, the vote can be recalculated. I'm interested in hearing about clever ways to track votes without keeping a whole accounting of them in memory.


Power Up and Delegate

```javascript
exports.power_up = (json, from, active, pc) => {
  reward_spk(from, json.block_num).then((interest) => {
    var amount = parseInt(json.amount),
      lpp = getPathNum(["balances", from]),
      tpowp = getPathNum(["pow", "t"]),
      powp = getPathNum(["pow", from]);

    Promise.all([lpp, tpowp, powp])
      .then((bals) => {
        let lb = bals[0],
          tpow = bals[1],
          pow = bals[2],
          lbal = typeof lb != "number" ? 0 : lb,
          pbal = typeof pow != "number" ? 0 : pow,
          ops = [];
        if (amount <= lbal && active) {
          ops.push({ type: "put", path: ["balances", from], data: lbal - amount });
          ops.push({ type: "put", path: ["pow", from], data: pbal + amount });
          ops.push({ type: "put", path: ["pow", "t"], data: tpow + amount });
          const msg = `@${from}| Powered ${parseFloat(
            json.amount / 1000
          ).toFixed(3)} ${config.TOKEN}`;
          if (config.hookurl || config.status)
            postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
          ops.push({
            type: "put",
            path: ["feed", `${json.block_num}:${json.transaction_id}`],
            data: msg,
          });
        } else {
          ops.push({
            type: "put",
            path: ["feed", `${json.block_num}:${json.transaction_id}`],
            data: `@${from}| Invalid power up`,
          });
        }
        store.batch(ops, pc);
      })
      .catch((e) => {
        console.log(e);
      });
  });
};

exports.power_grant = (json, from, active, pc) => {
  var amount = parseInt(json.amount),
    to = json.to,
    Pgranting_from_total = getPathNum(["granting", from, "t"]),
    Pgranting_to_from = getPathNum(["granting", from, to]),
    Pgranted_to_from = getPathNum(["granted", to, from]),
    Pgranted_to_total = getPathNum(["granted", to, "t"]),
    Ppower = getPathNum(["pow", from]),
    Pup_from = getPathObj(["up", from]),
    Pdown_from = getPathObj(["down", from]),
    Pup_to = getPathObj(["up", to]),
    Pdown_to = getPathObj(["down", to]),
    Pgov = getPathNum(["gov", to]), // comma was missing here in the original
    Pinterest = reward_spk(from, json.block_num), // interest calc before balance changes
    Pinterest2 = reward_spk(json.to, json.block_num);
  Promise.all([
    Ppower,
    Pgranted_to_from,
    Pgranted_to_total,
    Pgranting_to_from,
    Pgranting_from_total,
    Pup_from,
    Pup_to,
    Pdown_from,
    Pdown_to,
    Pgov,
    Pinterest,
    Pinterest2,
  ])
    .then((mem) => {
      let from_power = mem[0],
        granted_to_from = mem[1],
        granted_to_total = mem[2],
        granting_to_from = mem[3],
        granting_from_total = mem[4],
        up_from = mem[5],
        up_to = mem[6],
        down_from = mem[7],
        down_to = mem[8],
        ops = [];
      if (amount < from_power && amount >= 0 && active && mem[9]) {
        // mem[9] checks for a gov balance in the to account
        if (amount > granted_to_from) {
          let more = amount - granted_to_from;
          if (up_from.max) up_from.max -= more;
          if (down_from.max) down_from.max -= more;
          if (up_to.max) up_to.max += more;
          if (down_to.max) down_to.max += more;
          ops.push({ type: "put", path: ["granting", from, "t"], data: granting_from_total + more });
          ops.push({ type: "put", path: ["granting", from, to], data: granting_to_from + more });
          ops.push({ type: "put", path: ["granted", to, from], data: granted_to_from + more });
          ops.push({ type: "put", path: ["granted", to, "t"], data: granted_to_total + more });
          ops.push({ type: "put", path: ["pow", from], data: from_power - more }); // weeks wait? chron ops? no, because of the power growth at vote
          ops.push({ type: "put", path: ["up", from], data: up_from });
          ops.push({ type: "put", path: ["down", from], data: down_from });
          ops.push({ type: "put", path: ["up", to], data: up_to });
          ops.push({ type: "put", path: ["down", to], data: down_to });
          const msg = `@${from}| Has granted ${parseFloat(amount / 1000).toFixed(3)} to ${to}`;
          if (config.hookurl || config.status)
            postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
          ops.push({
            type: "put",
            path: ["feed", `${json.block_num}:${json.transaction_id}`],
            data: msg,
          });
        } else if (amount < granted_to_from) {
          let less = granted_to_from - amount;
          if (up_from.max) up_from.max += less;
          if (down_from.max) down_from.max += less;
          if (up_to.max) up_to.max -= less;
          if (down_to.max) down_to.max -= less;
          ops.push({ type: "put", path: ["granting", from, "t"], data: granting_from_total - less });
          ops.push({ type: "put", path: ["granting", from, to], data: granting_to_from - less });
          ops.push({ type: "put", path: ["granted", to, from], data: granted_to_from - less });
          ops.push({ type: "put", path: ["granted", to, "t"], data: granted_to_total - less });
          ops.push({ type: "put", path: ["pow", from], data: from_power + less });
          ops.push({ type: "put", path: ["up", from], data: up_from });
          ops.push({ type: "put", path: ["down", from], data: down_from });
          ops.push({ type: "put", path: ["up", to], data: up_to });
          ops.push({ type: "put", path: ["down", to], data: down_to });
          const msg = `@${from}| Has granted ${parseFloat(amount / 1000).toFixed(3)} to ${to}`;
          if (config.hookurl || config.status)
            postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
          ops.push({
            type: "put",
            path: ["feed", `${json.block_num}:${json.transaction_id}`],
            data: msg,
          });
        } else {
          const msg = `@${from}| Has already granted ${parseFloat(amount / 1000).toFixed(3)} to ${to}`;
          if (config.hookurl || config.status)
            postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
          ops.push({
            type: "put",
            path: ["feed", `${json.block_num}:${json.transaction_id}`],
            data: msg,
          });
        }
      } else {
        const msg = `@${from}| Invalid delegation`;
        if (config.hookurl || config.status)
          postToDiscord(msg, `${json.block_num}:${json.transaction_id}`);
        ops.push({
          type: "put",
          path: ["feed", `${json.block_num}:${json.transaction_id}`],
          data: msg,
        });
      }
      store.batch(ops, pc);
    })
    .catch((e) => {
      console.log(e);
    });
};
```

The only new thing to note here is that delegations are only allowed to accounts with a gov balance, which only node-operating accounts can have. Removing or lowering a delegation will also calculate the SPK balance before the change. As far as I can figure there is no way to "double spend" LARYNX for rewards... please check me on this, it's important.
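On the double-spend question, the protection comes from the checkpoint advancing whenever interest is paid, so an immediate re-claim finds zero whole days elapsed. A simplified sketch of that invariant (condensed from reward_spk; the names here are hypothetical):

```javascript
const BLOCKS_PER_DAY = 28800;

// Simplified claim: pays whole-day interest, then advances the checkpoint.
const claim = (acc, headBlock, rate) => {
  const days = Math.floor((headBlock - acc.checkpoint) / BLOCKS_PER_DAY);
  if (days < 1) return 0; // minimum claim period, as in reward_spk
  const interest = parseInt(acc.staked * (rate / 365) * days);
  acc.balance += interest;
  // advance by whole days only; the partial-day remainder keeps accruing
  acc.checkpoint = headBlock - ((headBlock - acc.checkpoint) % BLOCKS_PER_DAY);
  return interest;
};

const acct = { balance: 0, checkpoint: 0, staked: 10000000 };
claim(acct, 3 * BLOCKS_PER_DAY, 0.001); // pays 3 days of interest once
claim(acct, 3 * BLOCKS_PER_DAY, 0.001); // immediate re-claim pays 0
```

The partial-day remainder survives the claim, so nothing is lost, but nothing can be counted twice either.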


API


As stated previously, front ends will have to calculate SPK balances from this same information, which means a little extra API is needed. It will need to be coupled with the interest-rate stats and the head block number.
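A sketch of what that front-end calculation could look like, assuming the API exposes the account's raw spk balance, its spkb checkpoint block, the staked amounts, and the stats rates alongside the head block number (the field names mirror the state paths above, but the helper itself is hypothetical):

```javascript
// Hypothetical front-end helper: derive the displayed SPK balance from raw state.
const displayedSpk = (account, stats, headBlock) => {
  const simpleInterest = (p, t, r) => parseInt((p * (1 + r / 365) - p) * t);
  const t = Math.floor((headBlock - account.spkb) / 28800); // whole days elapsed
  if (t < 1) return account.spk; // nothing accrued yet
  return (
    account.spk +
    simpleInterest(account.gov, t, stats.spk_rate_lgov) +
    simpleInterest(account.pow, t, stats.spk_rate_lpow) +
    simpleInterest(account.granted + account.granting, t, stats.spk_rate_ldel)
  );
};
```

The rates used must come from the same stats object the node consensus uses, or the displayed balance will drift from what the chain eventually credits.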


Thank You


Thank you to the community of node runners who help me and each other run and improve this and other Hive software. I appreciate your feedback here and on our Discord server.

", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/usvpezvf"},{"url": "https://hive.blog/hive-112019/@spknetwork/spk-network-ama-or-larynx-miner-tokens-and-and-spk-tokens", "probability": 0.9384009, "headline": "SPK Network AMA | LARYNX Miner Tokens and & SPK Tokens", "datePublished": "2022-05-24T04:25:33.726843", "datePublishedRaw": "11 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23tSWPJF7KKkV5Y4BjExn3UJS4zwrggMVS4zPDb2zx8KSCQwtx2HmbjiJSnMaJPu4f7SH.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tSWPJF7KKkV5Y4BjExn3UJS4zwrggMVS4zPDb2zx8KSCQwtx2HmbjiJSnMaJPu4f7SH.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJoKXcaSz2CGFrELeWgkhgg6NF6b2p45Rzy1BVcRrLKVBwJjhBD2CaK8xeEKzBX.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23vrpeEjrrEqAoGoqSkP6N2sdHLDkmqbxptkPkEh61HgGMT9X3EnZx2eH1aVhuCRgZKC2.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uQJUytT4wqtfas5RGTpjS61u73dNGds9S2uBY16WybJbm6WhTEzySVQZFBbfCzHp9ZS.png"], "description": "Hello everyone! We are happy to announce another SPK Network AMA. It will be held in the SPK Network Discord. The date is Monday, May 30, at 9:30 AM Pacific Time (16:30 UTC).\u2026 by spknetwork", "articleBody": "We are happy to announce another SPK Network AMA. It will be held in the SPK Network Discord. The date is Monday, May 30, at 9:30 AM Pacific Time (16:30 UTC). The topic of the AMA is two of the tokens of the SPK Network Ecosystem: LARYNX and SPK.\n\nYou can ask anything about these tokens. Questions related to the utility, future plans or details about the tokens.\n\nThe best questions will receive upvotes! So go ahead and ask away!", "articleBodyHtml": "

\"spkamalaspk.png\"


Hello everyone!


We are happy to announce another SPK Network AMA. It will be held in the SPK Network Discord. The date is Monday, May 30, at 9:30 AM Pacific Time (16:30 UTC). The topic of the AMA is two of the tokens of the SPK Network Ecosystem: LARYNX and SPK.


You can ask anything about these tokens: questions about their utility, future plans, or other details.


The best questions will receive upvotes! So go ahead and ask away!

", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/spk-network-ama-or-larynx-miner-tokens-and-and-spk-tokens"},{"url": "https://hive.blog/dex/@disregardfiat/theory-of-operation-answering-post-launch-feedback", "probability": 0.94684184, "headline": "SPK Claim Chain - Theory of Operation: Answering Post Launch Feedback", "datePublished": "2022-04-24T04:25:38.812440", "datePublishedRaw": "last year", "author": "disregardfiat", "authorsList": ["disregardfiat"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://www.compliancesigns.com/media/NH/workplace-safety/1000/Lifting-Sign-NHE-10030_1000.gif", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tSz3RGZEtwdrgABpNBoyuZrb25fBaDpqK1pP1ScoDbs6Sx3efPXb73k6JMHDK8qRV92.png"], "description": "SPK Claim Chain. Your questions answered. by disregardfiat", "articleBody": "The Sum of Expectations\n\nYesterdays launch was described as lackluster. Honesty this is the best feedback I could get as a developer. This means that the issues that did present themselves were largely unnoticed by most people. There is a major paradigm we are trying to overcome here, decentralization of infrastructure. This claim drop is our attempt to overcome distribution problems that are intrinsic to more traditional financing. Since the HIVE DAO is largely funding our efforts it of course makes sense to try and give that value back to the people who use Hive and not ourselves. There are no ways the any of the team are able to claim more tokens than any body else who had an account at the time of the snap, it's decentralized from the start.\n\nYou might be wondering now that this claim chain isn't what's in the spk network light paper. So let's find out why, what it is, and what we hope it will become.\n\nBootstraps\n\nPulling yourself up by your bootstraps was a common phrase and computer science loves the term. 
After all everything you've ever done with a computer can be boiled down to interesting combinations of a single logic gate. When thinking about forming a network of trust there often is a base unit as well: For bitcoin this is the length of the chain. Knowing that Proof of Work is required to build a chain means that I can trust the longest chain. For Proof of Stake systems this unit is the token.\n\nEthereum started out with a bitcoin burn. Proving a value to be converted into stake on Ethereum. SPK is starting with a 1:1 airdrop. The ideals are the same if the execution is slightly different.\n\nSo... Locking Larynx for Governance?\n\nThere are many novel concepts in smart contracting on a chain that doesn't have smart contracts. Effectively we are using a few features Hive does have to build trust in our layer 2 solution. Most notable is Multi-Signature Accounts.\n\nLocking Larynx at this stage is kind of like giving your node / avatar some muscle mass. Central exchanges use the mass of their single holdings to provide liquidity to an asset. Where individuals have to provide it here. Have you ever tried moving a couch with a toddler? It's really easy to do if the toddler is on the couch, and not the one trying to lift it with you; then it's impossible to pick up. This is effectively the paradigm here. Since currently the governance is only trying to do two things we have the current structure:\n\nOnly accounts operating nodes can Lock their governance. This will prevent normal users from accidentally locking tokens for 4 weeks with no benefit. These nodes then have a participation and stake weighted influence on a couple of aspects for running the DEX.\n\nFee: 0 to 1%\nMax Trade: 0 to 100%\nTrade Size Penalty: 0-100%\n\nHow these users get elected to participate is quite easy. Back to our toddler analogy. How many toddlers can outlift Mr Universe? The network selects people who can contribute overall to \"lifting\" the dex transactions. 
If a group of accounts ever steals the funds in the multisig wallet, they'll have effectively just purchased the orderbook at market rate, plus what ever else is locked in their accounts. The network may have to establish a new multi-sig wallet to become the interface... but nothing should cause overall damage to the ecosystem.\n\nEstablishing SPK Network\n\nThe goals of the network remain what was set out in the light paper. How those goals are accomplished is what we are building toward. How we exchange these tokens will likely remain similar, but the uses of them will be different. For instance, there will probably be several multi-signature wallets, each to manage a trading pair. Being LARYNX, SPK, or BROCA.\n\nThe trust provided by the Proof of Stake network allows us to hold liquidity in the SIP, build and run smart contracts. This is how the features from the light paper will come to fruition. If we set out to build this network on any ecosystem the methods would depend on the ecosystem. Would there have been gas fees? Who pays gas fees? Would users have to set up multiple wallets? etc etc\n\nFor the end users, this method should be the easiest, and also be the cheapest. We have a decentralized network of infrastructure providers to build and manage. HoneyComb seems like the perfect architecture and Hive the perfect home.\n\nProgress\n\nWhen we first started our DEX yesterday the logic was partially refunding amounts in excess of 18 HIVE. Today we can accept orders upto 150 HIVE. Our node operators have purchased LARYNX and locked collateral collectively to improve the network. Like an engine you can't have all oxygen with no fuel or vice versa; the balance here is turning the ignition, warming it up, and eventually building it's performance.\n\nTechnical Aspects\n\nUpdates, Consensus, Operations.\n\nI've done some videos and posts about setting up a node and tried to answer any questions about how and why things are the way they are. 
HoneyComb | SPKCC | DLUX - Overview and FAQs\n\nLet's examine a DAO Report:\n\nSPKCC Monitor - by @hivetrending\n\nTotal Supply\n\nTotal Tokens Claimed\n\nLocked in Governance\n\nTotal Held For Node Runners to Operate the DEX\n\nLiquid Supply\n\nTotal not in DEX orders or GOV\n\nGovernance Threshold\n\nAmount you have to power up to contribute overall\n\nAssumes only 1 additional account\nCan be lower if many are available (How many toddlers to lift Mr Universe)\n\nDEX Safety Limit\n\nThe collective weight of the poorer half of the Nodes\n\nAssumes you can safely lift the weight of the lightest accounts\n\nDEX Fee\n\nDefaults to 0.5%\nCan be set by node runners up to 1%\n\nDEX Max\n\nThe largest size order that can be placed\nCan be voted by node runners. 1->100%\n\nDEX Slope\n\nControls the size of lower priced orders\n0% means any order can be max size\n100% means orders can be as big as their proportion to the current price\nCan be voted by node runners. 0->100%\n\nMulti-sig Holdings\n\nWhat the DAO believes it holds(It doesn't check with API calls)\n\nBlocks Behind\n\nNode Health. 20 means 1 minute behind the head block. This is healthy\nhttps://spkinstant.hivehoneycomb.com is a special API node that runs real time. Consensus nodes shouldn't be this close to real time.\n\nConsensus / Runners / Total Nodes\n\n25 nodes agree and are real time\n10 nodes have enough GOV to help lift DEX positions, and are real time.\n40 nodes have registered. May be offline or resyncing\n\nThe Election Process\n\nThere are a lot of way things can go wrong or get attacked. The system will automatically abandon non-consensus chains and try to resync with the accounts holding the Multi-Sig keys. To prevent a catastrophic loss of consensus where a bad chain is inserted there is an election process. Every 5 minutes the nodes post their block to IPFS which gives a content based hash. This can be used as a number to determine if the memory states of each node are identical. 
If enough of the current runners submit the same block there is consensus, and an election happens. The next highest GOV account gets elected and any current runners that missed get dropped. 1 election per 5 minutes. and up to 1/3rd of the runners can be dropped. Also, at least 1 account holding the Multi-sig keys must reach the same consensus. If this fails no election happens, and no change occurs to the runners.\n\nAt the end of every day, the blocks that you were a runner are the \"credited transactions\" and these are evenly paid from the DEX fees. Also the Top 3 gov accounts that had > 90% of the credited transactions will automatically be given the keys to the multi-sig account.\n\nI'm working on a healing algorithm now, as well as a few other tiny fixes. Along with some other improvements from HIVE I believe this can be a strong foundation for decentralized trustless services.", "articleBodyHtml": "
\n\n

The Sum of Expectations

\n\n

Yesterday's launch was described as lackluster. Honestly, this is the best feedback I could get as a developer: it means the issues that did present themselves went largely unnoticed by most people. There is a major paradigm we are trying to overcome here, the decentralization of infrastructure. This claim drop is our attempt to overcome the distribution problems that are intrinsic to more traditional financing. Since the HIVE DAO is largely funding our efforts, it makes sense to give that value back to the people who use Hive rather than to ourselves. There is no way for any of the team to claim more tokens than anybody else who had an account at the time of the snapshot; it's decentralized from the start.

\n\n

You might be wondering why this claim chain isn't what's described in the SPK Network light paper. So let's find out why, what it is, and what we hope it will become.

\n\n

Bootstraps

\n\n

Pulling yourself up by your bootstraps is a common phrase, and computer science loves the term. After all, everything you've ever done with a computer can be boiled down to interesting combinations of a single logic gate. When forming a network of trust, there is often a base unit as well. For Bitcoin it is the length of the chain: knowing that Proof of Work is required to build a chain means I can trust the longest chain. For Proof of Stake systems, this unit is the token.

\n\n

Ethereum started out with a Bitcoin burn, proving a value to be converted into stake on Ethereum. SPK is starting with a 1:1 airdrop. The ideals are the same, even if the execution is slightly different.

\n\n

So... Locking Larynx for Governance?

\n\n

There are many novel concepts involved in smart contracting on a chain that doesn't have smart contracts. Effectively, we are using a few features Hive does have to build trust in our layer 2 solution. Most notable are Multi-Signature Accounts.
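To make the multi-signature idea concrete, here is a minimal sketch of the general shape of a Hive account authority shared by several node accounts. All account names, weights, and the threshold are hypothetical illustrations, not the actual SPK Network configuration:

```python
# Sketch of a Hive "active" authority shared by elected node runners.
# Names, weights, and the threshold are hypothetical examples only.
dex_active_authority = {
    "weight_threshold": 3,      # co-signatures totalling >= 3 are required
    "account_auths": [          # each elected runner carries weight 1
        ["node-runner-a", 1],
        ["node-runner-b", 1],
        ["node-runner-c", 1],
        ["node-runner-d", 1],
        ["node-runner-e", 1],
    ],
    "key_auths": [],            # no single key can move funds alone
}

def can_transact(signers, authority):
    """True if the signing accounts collectively meet the weight threshold."""
    weights = dict(authority["account_auths"])
    total = sum(weights.get(s, 0) for s in signers)
    return total >= authority["weight_threshold"]
```

With this shape, any 3 of the 5 runner accounts can co-sign a transfer, so no individual operator controls the shared balance.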

\n\n
\"2
\n\n

Locking Larynx at this stage is kind of like giving your node / avatar some muscle mass. Central exchanges use the mass of their single holdings to provide liquidity to an asset; here, individuals have to provide it. Have you ever tried moving a couch with a toddler? It's really easy if the toddler sits on the couch, but if the toddler is the one trying to lift it with you, it's impossible to pick up. This is effectively the paradigm here. Since the governance is currently only trying to do two things, we have the current structure:

\n\n

Only accounts operating nodes can lock their governance tokens. This prevents normal users from accidentally locking tokens for 4 weeks with no benefit. These nodes then have a participation- and stake-weighted influence on a few aspects of running the DEX:

\n\n
  • Fee: 0 to 1%
  • Max Trade: 0 to 100%
  • Trade Size Penalty: 0 to 100%
\n\n
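A minimal sketch of how stake-weighted voting on those three parameters could work. The stake-weighted mean and the clamping are assumptions for illustration; the post only specifies the allowed ranges:

```python
def weighted_parameter(votes, lo, hi):
    """Combine node runners' votes on a DEX parameter, weighted by locked
    governance stake, and clamp the result to its allowed range.

    votes: list of (locked_gov_stake, voted_value) pairs.
    """
    total_stake = sum(stake for stake, _ in votes)
    if total_stake == 0:
        raise ValueError("no governance stake voting")
    avg = sum(stake * value for stake, value in votes) / total_stake
    return min(max(avg, lo), hi)

# Hypothetical ballots: (locked LARYNX, voted fee %) per node.
# Stake-weighted mean: (1000*0.5 + 3000*1.0 + 1000*0.0) / 5000 = 0.7
fee = weighted_parameter([(1000, 0.5), (3000, 1.0), (1000, 0.0)], lo=0.0, hi=1.0)
```

Clamping to the published range means no coalition of voters can push the fee above 1% or below 0%, whatever ballots they submit.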

How these users get elected to participate is quite easy. Back to our toddler analogy: how many toddlers can outlift Mr Universe? The network selects the people who can contribute the most to \"lifting\" the DEX transactions. If a group of accounts ever steals the funds in the multisig wallet, they'll effectively just have purchased the orderbook at market rate, plus whatever else is locked in their accounts. The network may have to establish a new multi-sig wallet to become the interface... but nothing should cause lasting damage to the ecosystem.

\n\n

Establishing SPK Network

\n\n

The goals of the network remain as set out in the light paper; how those goals are accomplished is what we are building toward. How we exchange these tokens will likely remain similar, but their uses will be different. For instance, there will probably be several multi-signature wallets, one to manage each trading pair: LARYNX, SPK, or BROCA.

\n\n

The trust provided by the Proof of Stake network allows us to hold liquidity in the SIP and to build and run smart contracts. This is how the features from the light paper will come to fruition. Had we set out to build this network on another ecosystem, the methods would have depended on that ecosystem: Would there be gas fees? Who pays them? Would users have to set up multiple wallets? And so on.

\n\n

For end users, this method should be the easiest and also the cheapest. We have a decentralized network of infrastructure providers to build and manage. HoneyComb seems like the perfect architecture and Hive the perfect home.

\n\n

Progress

\n\n
\"Where
\n\n

When we first started our DEX yesterday, the logic was partially refunding amounts in excess of 18 HIVE. Today we can accept orders up to 150 HIVE. Our node operators have purchased LARYNX and collectively locked collateral to improve the network. Like an engine, you can't have all oxygen and no fuel, or vice versa; the balance here is turning the ignition, warming the engine up, and eventually building its performance.

\n\n

Technical Aspects

\n\n

Updates, Consensus, Operations.

\n\n

I've done some videos and posts about setting up a node and tried to answer any questions about how and why things are the way they are. HoneyComb | SPKCC | DLUX - Overview and FAQs

\n\n

Let's examine a DAO Report:

\n\n

\"SPKCC

\n\n

SPKCC Monitor - by @hivetrending

\n\n
  • Total Supply
    • Total tokens claimed
  • Locked in Governance
    • Total held for node runners to operate the DEX
  • Liquid Supply
    • Total not in DEX orders or GOV
  • Governance Threshold
    • Amount you have to power up to contribute overall
      • Assumes only 1 additional account
      • Can be lower if many are available (how many toddlers to lift Mr Universe?)
  • DEX Safety Limit
    • The collective weight of the poorer half of the nodes
      • Assumes you can safely lift the weight of the lightest accounts
  • DEX Fee
    • Defaults to 0.5%
    • Can be set by node runners, up to 1%
  • DEX Max
    • The largest order that can be placed
    • Can be voted on by node runners: 1% to 100%
  • DEX Slope
    • Controls the size of lower-priced orders
    • 0% means any order can be max size
    • 100% means orders can only be as big as their proportion of the current price
    • Can be voted on by node runners: 0% to 100%
  • Multi-sig Holdings
    • What the DAO believes it holds (it doesn't verify with API calls)
  • Blocks Behind
    • Node health: 20 means 1 minute behind the head block, which is healthy
    • https://spkinstant.hivehoneycomb.com is a special API node that runs in real time; consensus nodes shouldn't be this close to real time
  • Consensus / Runners / Total Nodes
    • 25 nodes agree and are real time
    • 10 nodes have enough GOV to help lift DEX positions and are real time
    • 40 nodes have registered (may be offline or resyncing)
\n\n
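The DEX Max and DEX Slope fields can be read together as a single order-size cap. The exact curve is not specified in the post; the linear penalty below is an assumption, sketched only to show how 0% slope allows full-size orders while 100% slope shrinks a low-priced order's cap in proportion to its price:

```python
def order_size_cap(order_price, current_price, holdings, dex_max, dex_slope):
    """Largest order the DEX would accept at a given price (illustrative).

    dex_max:   fraction (0..1) of multisig holdings one order may consume.
    dex_slope: 0.0 -> any order may be full size;
               1.0 -> the cap shrinks linearly with how far the order's
                      price sits below the current price (assumed linear).
    """
    base = dex_max * holdings
    ratio = min(order_price / current_price, 1.0)  # only penalise low bids
    return base * (1.0 - dex_slope * (1.0 - ratio))

# With 10,000 HIVE in the multisig, dex_max = 0.5, slope = 1.0:
# an at-price order may be 5,000 HIVE, a half-price order only 2,500.
```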

The Election Process

\n\n

There are a lot of ways things can go wrong or get attacked. The system will automatically abandon non-consensus chains and try to resync with the accounts holding the Multi-Sig keys. To prevent a catastrophic loss of consensus where a bad chain is inserted, there is an election process. Every 5 minutes the nodes post their block to IPFS, which yields a content-based hash. This hash can be used to determine whether the memory states of the nodes are identical. If enough of the current runners submit the same block, there is consensus and an election happens: the next-highest GOV account gets elected and any current runners that missed get dropped. There is one election per 5 minutes, and up to 1/3rd of the runners can be dropped. Also, at least one account holding the Multi-sig keys must reach the same consensus. If this fails, no election happens and the runners do not change.
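The five-minute round described above can be sketched as follows. The 2/3 quorum fraction is an assumption (the post only says "enough of the current runners"), and electing a single next-highest GOV account per round is a simplification:

```python
from collections import Counter

def election_round(reports, runners, key_holders, next_gov_account, quorum=2/3):
    """One five-minute consensus round (illustrative sketch).

    reports:     {account: ipfs_hash_of_posted_block} for this round.
    runners:     current elected runner accounts.
    key_holders: accounts holding the multi-sig keys.
    Returns the updated runner list, unchanged if consensus fails.
    """
    votes = Counter(reports[r] for r in runners if r in reports)
    if not votes:
        return runners
    block, agreeing = votes.most_common(1)[0]
    # Consensus needs a quorum of runners AND at least one key holder
    # reporting the same content hash.
    if agreeing < quorum * len(runners):
        return runners
    if not any(reports.get(k) == block for k in key_holders):
        return runners
    missed = [r for r in runners if reports.get(r) != block]
    dropped = set(missed[: len(runners) // 3])  # at most 1/3 dropped per round
    new_runners = [r for r in runners if r not in dropped]
    if dropped and next_gov_account:
        new_runners.append(next_gov_account)    # next-highest GOV gets elected
    return new_runners
```

Because identical memory state produces an identical IPFS content hash, comparing one short hash per node stands in for comparing entire state databases.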

\n\n

At the end of every day, the blocks for which you were a runner are your \"credited transactions\", and these are paid evenly from the DEX fees. Also, the top 3 GOV accounts that had > 90% of the credited transactions will automatically be given the keys to the multi-sig account.
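A sketch of that end-of-day accounting. The even per-block split follows the text; using the day's best attendance as the denominator for the >90% rule is an assumption, and all account names are hypothetical:

```python
def daily_payout(credited_blocks, dex_fees_collected):
    """Split a day's DEX fees evenly across credited blocks (illustrative).

    credited_blocks: {account: blocks_credited_while_a_runner}
    Returns (payouts_per_account, accounts_meeting_the_90%_key_rule).
    """
    total_blocks = sum(credited_blocks.values())
    if total_blocks == 0:
        return {}, set()
    per_block = dex_fees_collected / total_blocks
    payouts = {a: n * per_block for a, n in credited_blocks.items()}
    # Key eligibility: credited for > 90% of the day's rounds
    # (best attendance used as the round count here, an assumption).
    rounds = max(credited_blocks.values())
    eligible = {a for a, n in credited_blocks.items() if n > 0.9 * rounds}
    return payouts, eligible
```

The multi-sig keys then go to the top 3 GOV accounts drawn from that eligible set, tying custody of funds to both stake and demonstrated uptime.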

\n\n

I'm working on a healing algorithm now, as well as a few other tiny fixes. Along with some other improvements from HIVE, I believe this can be a strong foundation for decentralized, trustless services.

\n\n
", "canonicalUrl": "https://peakd.com/dex/@disregardfiat/theory-of-operation-answering-post-launch-feedback"},{"url": "https://hive.blog/hive-112019/@spknetwork/spk-network-tokens-logo-winners", "probability": 0.69389784, "headline": "SPK Network Tokens Logo Winners", "datePublished": "2022-05-24T04:25:40.404431", "datePublishedRaw": "11 months ago", "author": "spknetwork", "authorsList": ["spknetwork"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23uFRN5wmKSBnetH3zVpHbcabRmdZBBxhJt7dpt6RjLy187Fjw962WTbqg8HVxRis2rXT.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uFRN5wmKSBnetH3zVpHbcabRmdZBBxhJt7dpt6RjLy187Fjw962WTbqg8HVxRis2rXT.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJoKXcaSz2CGFrELeWgkhgg6NF6b2p45Rzy1BVcRrLKVBwJjhBD2CaK8xeEKzBX.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AKJ5V3xCRq442d5LTf6y15itAYgbM3mrhbsSywXCdT71dY3z6PC7V6jhQ7q2w1f.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23xVnbqFELpmPK6wovexXmCopF4bRrDWgcfBDYxXxb3P8SxsGEA5ZeByLxqPFSDJAkm4X.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uFwHvSbVV1E2JKbfhV9anamz6s3wWzYVVspJ66HWf6LKLnsoQgEmSbTyMQXVB86WrRg.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23sUVqWPTMvcwBMGtfH2QDqd1cvyP67x2S2Lec53FKBSoZqEUUNqHKzuf49VVSjoSk5fq.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23viTFGeDNsMmh1HNKTr1JPRYqLCYoZjGZjo7A8oL7z6EU32UzHNE2qaRxBs9GZc4QP1e.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AKPp6YGAgPszmzGFbh53Bj9VATSzY49AJZdAaTBXt1s5sZ8q7hFDAgWxLMYPpWQ.png", 
"https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uFwDcULFTTjYNC2hTyRGpbMAxjNVnKmLHhQkLM5wYeShqzBu3PDff1keNpwEWaDBcW3.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uFueZwQhWkgfQjiiYDfzErmzkmVqvvs2vvuvxSXRdJGfabACgyBQhbGvtMftso773zb.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23uFwpPStWHpXm1dcgEUu3StLsgyr7FZ3o376HZ1y2kxzUTNmVZu5XzbvCtMtyjHJUbf7.png"], "description": "Hello everyone! Thank you to all the Hivers that participated in the logo contest . We are aware that the contest ended on April 30th. We took our time to decide because\u2026 by spknetwork", "articleBody": "Hello everyone!\n\nThank you to all the Hivers that participated in the logo contest. We are aware that the contest ended on April 30th. We took our time to decide because there were a lot of good entries. It was a tough decision and not to be taken lightly. We appreciate your effort and contributions.", "articleBodyHtml": "
\n\n
\"spkntokenslogoconwinners.png\"
\n\n

Hello everyone!

\n\n

Thank you to all the Hivers that participated in the logo contest. We are aware that the contest ended on April 30th. We took our time to decide because there were a lot of good entries. It was a tough decision and not to be taken lightly. We appreciate your effort and contributions.

\n\n
", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/spk-network-tokens-logo-winners"},{"url": "https://hive.blog/hive-112019/@kameroon/vhyjikzb", "probability": 0.91195333, "headline": "Billionaire Lifestyle In 3 Minutes\ud83d\udcb2 [2023 BILLIONAIRE", "datePublished": "2023-04-23T19:25:43.630507", "datePublishedRaw": "9 hours ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafkreifqnrlx6arrghimneev2xqkvzr4vivi3bsrcoqnzoxyfjlip6y6te", "images": ["https://3speak.tv/embed?v=kameroon/vhyjikzb"], "description": "\u25b6\ufe0f Watch on 3Speak Watching our motivational videos brings you one step further, no matter how far you have progressed on your path yet. Billionaire Lifestyle In 3\u2026 by kameroon", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nWatching our motivational videos brings you one step further, no matter how far you have progressed on your path yet.\n\nBillionaire Lifestyle In 3 Minutes\ud83d\udcb2 [2023 BILLIONAIRE MOTIVATION] #1\n\nCozzzy Media\nhttps://youtube.com/@161london\n\nMusic:\nWarriyo - Mortals (feat. Laura Brehm) [NCS Release]\n\nCopyright Disclaimer under Section 107 of the copyright act 1976, allowance is made for fair use for purposes such as criticism, comment, news reporting, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational or personal use tips the balance in favour of fair use.\n\nThis video has no negative impact on the original works (It actually helps them because we spread their message even further and share them. People are more likely to believe and buy from someone they have seen 100 times than from a person they see for the first time.)\nThis video is used for teaching purposes.\nI only used small pieces of the videos to get the point across where necessary.\n\nThis video was produced for educational and motivational, but also inspirational purposes. 
We care about reaching as many people as possible and helping them with their motivation. We do not own the videos and music used in this video. If any owners of the content clips would like us to remove their video, we will do so as soon as possible. Just contact us at [email protected]\n\nHow do you as content creator benefit from our video?\n\nWe give credit to your channel - New subscribers, more views and new people inside your ecosystem\nYou help us achieve our goal of inspiring many people and making a better life possible.", "articleBodyHtml": "
\n\n
\n\n

\u25b6\ufe0f Watch on 3Speak

\n\n

Watching our motivational videos brings you one step further, no matter how far you have progressed on your path yet.

\n\n

Billionaire Lifestyle In 3 Minutes\ud83d\udcb2 [2023 BILLIONAIRE MOTIVATION] #1

\n\n

Cozzzy Media
\nhttps://youtube.com/@161london

\n\n

Music:
\nWarriyo - Mortals (feat. Laura Brehm) [NCS Release]

\n\n
\n\n

Copyright Disclaimer under Section 107 of the copyright act 1976, allowance is made for fair use for purposes such as criticism, comment, news reporting, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational or personal use tips the balance in favour of fair use.

\n\n
  • This video has no negative impact on the original works (it actually helps them because we spread their message even further and share them. People are more likely to believe and buy from someone they have seen 100 times than from a person they see for the first time.)
  • This video is used for teaching purposes.
  • I only used small pieces of the videos to get the point across where necessary.
\n\n

This video was produced for educational and motivational, but also inspirational purposes. We care about reaching as many people as possible and helping them with their motivation. We do not own the videos and music used in this video. If any owners of the content clips would like us to remove their video, we will do so as soon as possible. Just contact us at [email\u00a0protected]

\n\n

How do you as a content creator benefit from our video?

\n\n
  • We give credit to your channel - new subscribers, more views and new people inside your ecosystem.
  • You help us achieve our goal of inspiring many people and making a better life possible.
\n\n
", "videoUrls": ["https://www.youtube.com/embed/yJg-Y5byMMw?enablejsapi=0&rel=0&origin=https://hive.blog&start=0"], "canonicalUrl": "https://hive.blog/hive-112019/@kameroon/vhyjikzb"},{"url": "https://hive.blog/hive-112019/@spknetwork/spk-network-or-spk-token-going-live-next-monday-call-to-action", "probability": 0.9036639, "headline": "SPK Network | SPK Token Going Live Next Monday - Call to Action", "datePublished": "2022-07-24T04:25:46.419277", "datePublishedRaw": "9 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23wC7qs95hkZcDZMfEJ8b2em5nrmtNWAFmsrxRac46mEJghAwqzc4TnDLeUibqyh3SKkL.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23wC7qs95hkZcDZMfEJ8b2em5nrmtNWAFmsrxRac46mEJghAwqzc4TnDLeUibqyh3SKkL.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tGVdTy5LrgDgWB982FWd9SHwp8ZhE1qEcDz5ZsAMk7oFiBuTUdo5o7KXZvzQtUf7sMr.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AKKT8y4oZxRuHtnqaf435VFv974gYYrxNCa8pPq3PpU5UB26rwufTLBpbQ5dpni.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tGVYde2sDDZ43PWLcfxX3R6NycXDpJzi2ogQ5YegnvnMBd4imU6MfARq937mC85Z1XE.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23vrpeEjrrEqAoGoqSkP6N2sdHLDkmqbxptkPkEh61HgGMT9X3EnZx2eH1aVhuCRgZKC2.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJpj9ZgmM4hQSW2bLEKXTHWkeFK3GqhVfKCPbdPG2uN7RvMNSfD2Y2peggC9gFi.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/Eo1utBXRsSakafcorW92gWPcf1RLY3k5zagb2UGHXDr55mYADmMJUwgLJEiPhLdSC4Y.png", 
"https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/zottone444/23t7AyKqAfdxKEJPQrpePMW15BCPhbyrf5VoHWxhBFcEcPLjDUVVQAh9ZAopbmoJDekS6.png"], "description": "We are glad to announce that SPK Tokens will go live next Monday, August 8. It will go live at 7:00 PM Pacific Time (Tuesday, August 9, 2:00\u2026 by spknetwork", "articleBody": "We are glad to announce that SPK Tokens will go live next Monday, August 8. It will go live at 7:00 PM Pacific Time (Tuesday, August 9, 2:00 UTC).\n\nThank you all for testing! We have solved a few bugs and are ready to go live!\n\nCall To Action\n\nAfter the token is live, we will host a call to action where users will be able to power up their tokens and delegate them to one of the nodes. You can see the list of the nodes here.\n\nRemember, there are two types of LARYNX, Locked and Power.\n\nLocked is only for the node operators:\n\nPower is for users that want to earn SPK by delegating to one of the node operators:\n\nDecide which node to delegate to. Check out the SPKCC Monitor for that info. Select the node from the drop-down menu.\n\nNote: It may take up to 60 seconds for the changes to update. You can confirm by clicking in magnifying glass icon.\n\nYou are ready to start earning SPK!\n\nMake a Post and Share on Twitter and other Web2 Sites\n\nTo complete the Call to Action create a post about your LARYNX delegation. You can share the step-by-step process and your thoughts and feedback about SPK, LARYNX, and SPK Network.\n\nWe will upvote all eligible posts with the @theycallmedan account and @threespeak (only 3Speak videos).", "articleBodyHtml": "
\n\n
\"spktokenlive.png\"
\n\n

We are glad to announce that SPK Tokens will go live next Monday, August 8. It will go live at 7:00 PM Pacific Time (Tuesday, August 9, 2:00 UTC).

\n\n

Thank you all for testing! We have solved a few bugs and are ready to go live!

\n\n

Call To Action

\n\n

After the token is live, we will host a call to action where users will be able to power up their tokens and delegate them to one of the nodes. You can see the list of the nodes here.

\n\n
\"larynxtoken.png\"
\n\n

Remember, there are two types of LARYNX: Locked and Power.

\n\n

Locked is only for the node operators:

\n\n
\"larynxlocked.png\"
\n\n

Power is for users that want to earn SPK by delegating to one of the node operators:

\n\n
\"larynxpower.png\"
\n\n
  • Decide which node to delegate to. Check out the SPKCC Monitor for that info. Select the node from the drop-down menu.
\n\n
\"delegateL.png\"
\n\n

Note: It may take up to 60 seconds for the changes to update. You can confirm by clicking the magnifying glass icon.

\n\n
  • You are ready to start earning SPK!
\n\n

Make a Post and Share on Twitter and other Web2 Sites

\n\n

To complete the Call to Action, create a post about your LARYNX delegation. You can share the step-by-step process and your thoughts and feedback about SPK, LARYNX, and the SPK Network.

\n\n

We will upvote all eligible posts with the @theycallmedan account and @threespeak (only 3Speak videos).

\n\n
", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/spk-network-or-spk-token-going-live-next-monday-call-to-action"},{"url": "https://hive.blog/hive-112019/@psorigins/ewuhhqmt", "probability": 0.8248895, "headline": "Easily Set up Your Own Web3 Enabled Breakaway Community Platform", "datePublished": "2022-05-24T04:25:49.530131", "datePublishedRaw": "11 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://i.imgur.com/t9Zlv0U.png", "images": ["https://images.hive.blog/768x0/https://i.imgur.com/pBPHCec.png", "https://images.hive.blog/768x0/https://i.imgur.com/t9Zlv0U.png", "https://3speak.tv/embed?v=psorigins/ewuhhqmt"], "description": "\u25b6\ufe0f Watch on 3Speak This video contains all the resources you need to get your own break-away community up and running Discord contact info: break-away community SPK\u2026 by psorigins", "articleBody": "This video contains all the resources you need to get your own break-away community up and running\n\nDiscord contact info: break-away community SPK server channel\n\nRequirements:\n\nServer. (The one used in the video is purchased from Privex WebBox\u2122). You will need your server password, server username and server IP address from the server provider (found in the given email).\nDomain/URL. (The one used in the video is purchased from Go daddy). You will need to input two custom fields in the godaddy configuration site, shown in one of the pictures uploaded below, both having the same value (your server IP address) and the names of @ (base URL) and www (www..com)\nHIVE Community (Here's a perfect tutorial for setting that up HIVE community tutorial)\n\nCommands used in the video:\n\n1. sudo -i 2. curl -SL https://github.com/docker/compose/releases/download/v2.5.0/docker-compose-linux-x86_64 -o /usr/local/bin/docker-compose && sudo chmod +x /usr/local/bin/docker-compose 3. npx @spknetwork/community-create 4. 
sudo certbot --nginx -d example.com -d www.example.com\n\nCommand use:\n\nUsed for getting root (admin) access to the server\nDownloading docker-compose on the privex server\nRunning the custom cli to create the break-away community\nGetting a SSL (HTTPS) certification for your domain\n\nBreak-away community example (the one made in video) starterkitdao\n\nGodaddy site config", "articleBodyHtml": "
\n\n
\n

\u25b6\ufe0f Watch on 3Speak

\n\n

This video contains all the resources you need to get your own break-away community up and running

\n\n

Discord contact info: break-away community SPK server channel

\n\n

Requirements:

\n\n
  1. Server. (The one used in the video is purchased from Privex WebBox\u2122.) You will need your server password, server username and server IP address from the server provider (found in the given email).
  2. Domain/URL. (The one used in the video is purchased from GoDaddy.) You will need to input two custom fields in the GoDaddy configuration site, shown in one of the pictures uploaded below, both having the same value (your server IP address) and the names @ (base URL) and www (www.<your-domain>.com).
  3. HIVE Community. (Here's a perfect tutorial for setting that up: HIVE community tutorial.)
\n\n

Commands used in the video:

\n\n
1. sudo -i
2. curl -SL https://github.com/docker/compose/releases/download/v2.5.0/docker-compose-linux-x86_64 -o /usr/local/bin/docker-compose && sudo chmod +x /usr/local/bin/docker-compose
3. npx @spknetwork/community-create
4. sudo certbot --nginx -d example.com -d www.example.com
\n\n

Command use:

\n\n
  1. Used for getting root (admin) access to the server
  2. Downloading docker-compose on the Privex server
  3. Running the custom CLI to create the break-away community
  4. Getting an SSL (HTTPS) certificate for your domain
\n\n

Break-away community example (the one made in the video): starterkitdao

\n\n
\n\n

Godaddy site config

\n\n
\n\n
", "canonicalUrl": "https://hive.blog/hive-112019/@psorigins/ewuhhqmt"},{"url": "https://hive.blog/dhf/@disregardfiat/honeycomb-improvements-and-development", "probability": 0.932192, "headline": "Honeycomb - Improvements and Development", "datePublished": "2022-09-24T04:25:57.267327", "datePublishedRaw": "7 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/disregardfiat/23xAxyaEGVSqi2cZu8WVbBdxjFmwg71qPBz3D8rhGDM5VT2KL4rwGcUNUzUp7LjpaTS74.png", "images": ["https://images.hive.blog/0x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23xAxyaEGVSqi2cZu8WVbBdxjFmwg71qPBz3D8rhGDM5VT2KL4rwGcUNUzUp7LjpaTS74.png"], "description": "Request for funding to improve the Honeycomb Layer 2 Software by disregardfiat", "articleBody": "Honeycomb\n\nLast Hivefest I stood on stage and announced the development of Honeycomb and laid out a few features for our roadmap. It's been a little over a year and with the help of a couple of Hive projects we've been able to deliver working versions of most of these features. Honeycomb allows anyone to create a token on Hive using little more than a few resource credits and a block stream. It has a fully decentralized operation that builds a multi-signature wallet to operate a DEX with minimal liquidity requirements. We've operated 3 chains(DLUX, SPK, DUAT-Ragnarok) with marginal errors, that have all lead to more resilient exchange routines, and increased financial security. With HF26 incoming testing has begun on further changes that really bring Honeycomb up to the level required to operate a large network like SPK.\n\nThis paradigm brings censorship resistant smart contracting to HIVE in a big way. 
And we aren't done yet.\n\nWe would like to petition HIVE to support further development that continues to improve our ecosystem.\n\nDHF Proposal (TL/DR)\n\nAutomated configuration for New Token Ecosystems\nRC Credit Automation to Support Network Traffic\nRC Credit Markets\nAccount Creation Markets\nZK-Rollups for L2 <-> L2 Bridging\nContinue Stability Improvements\nDocumentation Improvements\nRefactor\n\n150 HBD /Day for 1 year.\n\nAutomated configuration for New Token Ecosystems\n\nWe hope to build and maintain easy to use tools to build new token ecosystems. Answer a few questions and generate the 2 files the will turn a Honeycomb fork into a decentralized ecosystem. Config and initial state/token distro. This will include setting up the initial DEX account with multi-sig authorities. All that will be left is for the community to clone the new repo, update the .env with there account and run the architecture.\n\nRC Credit Automation to Support Network Traffic\n\nAfter HF-26 the node accounts that run honeycomb can be even more secured by not having a HP delegation. While custom_json costs are expected to go up, all transactions may have additional RC costs associated with multi-sig verification. This leads to conditions that must be monitored to ensure the parties to an ecosystem can maintain their ability to send custom-Jsons as well as transfers from the DEX. This can and should be automated.\n\nRC Credit Markets\n\nAdditionally RC markets can be created for much broader purpose and honeycomb is certainly a vehicle to build them.\n\nAccount Creation Markets\n\nNot a single provider of free accounts hasn't been taken advantage of. Having a market for ACT(Account Creation Tokens) that can be claimed and sold with-in the current rules for ACTs would be hard to impossible on other Hive paradigms. 
Utilizing the already decentralized nature of the nodes on Honeycomb, accounts that have ACTs could be automated and their ACTs sold at market value with some controls like minimum price.\n\n~ZK-Rollups for L2 <-> L2 Bridging\n\nBuild on similar code to bridge different honeycomb ecosystems thru use of Hive Escrow Transactions and Zero-Knowledge Proofs(Better and faster than similar oracle solutions).\n\nContinue Stability Improvements\n\nSoftware is always moving an improving, keeping up to date with changes and implementing fixes is a must for any far reaching architecture.\n\nDocumentation Improvements\n\nRelease documents and guides to help ecosystems take advantage of NFT markets, ICOs, smart contracts, and more.\n\nRefactor\n\nImprove the base code readability so it's easier for the community to get involved and improve the ecosystem with further features.\n\nCosts\n\nWhile server costs and other things are usually a concern I believe that each community should be able to incentive their node runners and API services internally. To that effect I'm only asking for $150/day to work on the outlined base layer architecture while maintaining significant time to aid communities in their own development. I believe services like this would normally pay in the $100/hr range, but as this is a project I firmly believe in and I don't require the biggest paychecks $150/day seems to be more than fair to keep the lights on. I want to utilize the DHF because I believe value to the overall community will be much higher than this payout and all community members should benefit from this proposal thru price action of those that intend to use these features.\n\nTrack Record\n\nWith over 6 years on the blockchain I have been working here exclusively for the last 4. I've brought several concepts to life from DLUX, SPK, DUAT to HashKings v1. My github is chock-full of concepts brought to life, and my posting history has discussions in a wide range of industry knowledge. 
I run a witness and a backup for ensured up time, where I'm currently in the top 50. (You can help improve this standing as well with a vote). I've also increased my stake to over $20k in Hive denominated assets with several thousand more on Honeycomb assets.\n\nVoting\n\nFinally", "articleBodyHtml": "
\n\n

Honeycomb

\n\n
\n\n

Last Hivefest I stood on stage and announced the development of Honeycomb and laid out a few features for our roadmap. It's been a little over a year, and with the help of a couple of Hive projects we've been able to deliver working versions of most of these features. Honeycomb allows anyone to create a token on Hive using little more than a few resource credits and a block stream. It has a fully decentralized operation that builds a multi-signature wallet to operate a DEX with minimal liquidity requirements. We've operated 3 chains (DLUX, SPK, DUAT-Ragnarok) with only marginal errors, all of which have led to more resilient exchange routines and increased financial security. With HF26 incoming, testing has begun on further changes that bring Honeycomb up to the level required to operate a large network like SPK.

\n\n
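The multi-signature wallet idea above can be sketched as a simple threshold check: a DEX transfer only executes once enough current node operators have signed. This is an illustrative model only; the 2/3 threshold and account names are assumptions, not Honeycomb's actual consensus rule.

```python
# Hypothetical sketch of multi-sig approval for a DEX transfer:
# the transfer is valid only when a supermajority of the *current*
# node operators has signed. Threshold and names are assumed.

def approval_threshold(num_operators: int) -> int:
    """Signatures required: strictly more than 2/3 of operators."""
    return (2 * num_operators) // 3 + 1

def can_execute(signers: set[str], operators: set[str]) -> bool:
    """True when enough valid operator signatures are present."""
    valid = signers & operators  # discard signatures from non-operators
    return len(valid) >= approval_threshold(len(operators))

operators = {"node-a", "node-b", "node-c", "node-d", "node-e"}
print(can_execute({"node-a", "node-b"}, operators))                      # too few
print(can_execute({"node-a", "node-b", "node-c", "node-d"}, operators))  # enough
```

Discarding signatures from accounts that are no longer operators is the important detail: stake-weighted elections can rotate the operator set between signing and execution.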

This paradigm brings censorship resistant smart contracting to HIVE in a big way. And we aren't done yet.

\n\n

We would like to petition HIVE to support further development that continues to improve our ecosystem.

\n\n

DHF Proposal (TL/DR)

\n\n
  • Automated configuration for New Token Ecosystems
  • \n
  • RC Credit Automation to Support Network Traffic
  • \n
  • RC Credit Markets
  • \n
  • Account Creation Markets
  • \n
  • ZK-Rollups for L2 <-> L2 Bridging
  • \n
  • Continue Stability Improvements
  • \n
  • Documentation Improvements
  • \n
  • Refactor
\n\n

150 HBD /Day for 1 year.

\n\n

Automated configuration for New Token Ecosystems

\n\n

We hope to build and maintain easy-to-use tools for building new token ecosystems. Answer a few questions and generate the two files that will turn a Honeycomb fork into a decentralized ecosystem: the config and the initial state/token distribution. This will include setting up the initial DEX account with multi-sig authorities. All that will be left is for the community to clone the new repo, update the .env with their account, and run the architecture.

\n\n
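The "answer a few questions, get two files" flow described above could look roughly like the following. All field names (prefix, precision, the `-dex` account naming) are invented placeholders, not the real Honeycomb schema.

```python
# Illustrative generator for the two files that bootstrap a Honeycomb
# fork: a config and an initial state/token distribution. The schema
# here is an assumption for demonstration purposes only.
import json

def build_files(token: str, prefix: str, distro: dict[str, int]):
    config = {
        "token": token,
        "prefix": prefix,                       # custom_json id prefix on Hive
        "precision": 3,
        "multisig_account": f"{token.lower()}-dex",  # hypothetical DEX account
    }
    state = {
        "supply": sum(distro.values()),
        "balances": distro,                     # initial token distribution
    }
    return json.dumps(config, indent=2), json.dumps(state, indent=2)

config_json, state_json = build_files("DEMO", "demo_", {"alice": 700, "bob": 300})
print(config_json)
```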

RC Credit Automation to Support Network Traffic

\n\n

After HF-26, the node accounts that run Honeycomb can be further secured by not holding an HP delegation. While custom_json costs are expected to go up, all transactions may carry additional RC costs associated with multi-sig verification. These conditions must be monitored to ensure the parties to an ecosystem retain the ability to send custom_jsons as well as transfers from the DEX. This can and should be automated.

\n\n
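The monitoring loop this describes reduces to a simple watermark check per account. A minimal sketch, assuming a 20% refill threshold (an invented policy); real code would fetch mana values from a Hive API node rather than a hard-coded dict.

```python
# Sketch of RC monitoring: flag operator accounts whose resource
# credits have fallen below a safety margin so a top-up (delegation
# or refill) can be scheduled. Threshold and fleet are assumptions.

LOW_WATERMARK = 0.20  # refill below 20% of max RC mana (assumed policy)

def needs_topup(current_mana: int, max_mana: int) -> bool:
    return max_mana > 0 and current_mana / max_mana < LOW_WATERMARK

def plan_topups(accounts: dict[str, tuple[int, int]]) -> list[str]:
    """Return the accounts that should receive an RC top-up."""
    return [name for name, (cur, mx) in accounts.items() if needs_topup(cur, mx)]

fleet = {"dex-node": (5_000, 100_000), "api-node": (90_000, 100_000)}
print(plan_topups(fleet))  # only the starved account is flagged
```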

RC Credit Markets

\n\n

Additionally, RC markets can be created for much broader purposes, and Honeycomb is certainly a vehicle to build them.

\n\n

Account Creation Markets

\n\n

Every provider of free accounts has been taken advantage of at some point. Having a market for ACTs (Account Creation Tokens) that can be claimed and sold within the current rules for ACTs would be difficult or impossible on other Hive paradigms. Utilizing the already decentralized nature of the nodes on Honeycomb, accounts that hold ACTs could be automated and their ACTs sold at market value, with some controls such as a minimum price.

\n\n
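The minimum-price control mentioned above can be modeled as a floor applied before matching. A toy order-matching sketch under assumed numbers; a real Honeycomb market would settle through the multi-sig DEX account, not in-memory.

```python
# Toy ACT market: fill the cheapest ask that is both at or below the
# buyer's bid AND at or above the community's price floor. The floor
# value and ask book are illustrative assumptions.

MIN_PRICE = 3.0  # assumed floor, in HIVE per ACT

def match_bid(asks: list[tuple[str, float]], bid: float):
    """Return (seller, price) for the best fillable ask, else None."""
    valid = [(seller, p) for seller, p in asks if MIN_PRICE <= p <= bid]
    return min(valid, key=lambda a: a[1]) if valid else None

asks = [("node-a", 2.0), ("node-b", 3.5), ("node-c", 4.0)]
print(match_bid(asks, 5.0))  # node-a's ask is under the floor, so node-b fills
```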

ZK-Rollups for L2 <-> L2 Bridging

\n\n

Build on similar code to bridge different Honeycomb ecosystems through the use of Hive Escrow Transactions and Zero-Knowledge Proofs (better and faster than comparable oracle solutions).

\n\n
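A core primitive behind escrow-based bridging is the hash lock: funds release only when a party reveals a preimage matching a prior commitment. This toy shows the commitment step only; the actual design would pair Hive escrow operations with zero-knowledge proofs, which are well beyond this sketch.

```python
# Commit-reveal sketch of the hash-lock idea underlying escrow bridges:
# a commitment is published first, and redemption succeeds only with
# the matching preimage. Purely illustrative of the bridging concept.
import hashlib
import secrets

def commit(secret: bytes) -> str:
    return hashlib.sha256(secret).hexdigest()

def redeem(commitment: str, revealed: bytes) -> bool:
    return hashlib.sha256(revealed).hexdigest() == commitment

secret = secrets.token_bytes(32)
lock = commit(secret)
print(redeem(lock, secret))    # True: the correct preimage releases escrow
print(redeem(lock, b"wrong"))  # False: anything else is rejected
```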

Continue Stability Improvements

\n\n

Software is always moving and improving; keeping up to date with changes and implementing fixes is a must for any far-reaching architecture.

\n\n

Documentation Improvements

\n\n

Release documents and guides to help ecosystems take advantage of NFT markets, ICOs, smart contracts, and more.

\n\n

Refactor

\n\n

Improve the base code readability so it's easier for the community to get involved and improve the ecosystem with further features.

\n\n

Costs

\n\n

While server costs and other expenses are usually a concern, I believe each community should be able to incentivize its node runners and API services internally. To that effect, I'm only asking for $150/day to work on the outlined base-layer architecture while keeping significant time free to aid communities in their own development. Services like this would normally pay in the $100/hr range, but as this is a project I firmly believe in and I don't require the biggest paychecks, $150/day seems more than fair to keep the lights on. I want to utilize the DHF because I believe the value to the overall community will be much higher than this payout, and all community members should benefit from this proposal through the price action of the projects that intend to use these features.

\n\n

Track Record

\n\n

With over 6 years on the blockchain, I have been working here exclusively for the last 4. I've brought several concepts to life, from DLUX, SPK, and DUAT to HashKings v1. My GitHub is chock-full of concepts brought to life, and my posting history has discussions across a wide range of industry knowledge. I run a witness and a backup for ensured uptime, and I'm currently in the top 50. (You can help improve this standing with a vote.) I've also increased my stake to over $20k in Hive-denominated assets, with several thousand more in Honeycomb assets.

\n\n

Voting

\n\n

Finally

\n\n
", "canonicalUrl": "https://peakd.com/dhf/@disregardfiat/honeycomb-improvements-and-development"},{"url": "https://hive.blog/hive-112019/@spknetwork/spk-network-funding-proposal-rhnv7e", "probability": 0.93114376, "headline": "SPK Network Funding Proposal", "datePublished": "2022-08-24T04:26:05.511218", "datePublishedRaw": "8 months ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/242hL2uzDhzzmVSJ48v26AsLREU9V1iiQ9qQwGFJg6acJQ7JCpUuD6S7NrCQheVx5dWwc.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/242hL2uzDhzzmVSJ48v26AsLREU9V1iiQ9qQwGFJg6acJQ7JCpUuD6S7NrCQheVx5dWwc.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/behiver/23xxy8hzLsZkLXA9QACXv8MQJkkJBiNdLZzxfxDBHwNnKLRncavFBjXSv72tDbz94K28z.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/48UkptGNtyKguU66MquGzn48wBzERaKJZZiRPPiwL8DGKCx8HRM6WJrpvQDinnBrGg.png"], "description": "Note : For additional details on any of the below, please refer to the SPK Network Light Paper . Note : The SPK Network is completely open-source, and no tokens will be\u2026 by spknetwork", "articleBody": "Note: For additional details on any of the below, please refer to the SPK Network Light Paper.\n\nNote: The SPK Network is completely open-source, and no tokens will be given to the team working on it. 
All SPK token distribution will be done by the SPK Network protocol, which rewards peers in the Network for providing social media content infrastructure services, and via LARYNX miner token delegations.\n\nWhat the System Aims to Achieve for DPoS Graphene Technology:\n\nSONs wrapping technology helps to make DPoS the natural go-to side chain for high-fee protocols such as ETH and BTC.\nBTC, ETH, Hive, PPY, SPK, BROCA, LARYNX, DLUX, and tips are all from the same blockchain account (more chains will be added as time goes on).\nOff-chain storage for reducing content/info storage bloat from Hive layer I.\nCensorship-resistant data with the ability to delete your own off-chain content.\nIncentivised Peer to Peer content infrastructure provision system including Content Delivery, Encoding, and Storage.\nDistributed, on-chain NFT (attached image) & normal image storage on IPFS\nUser self-hosting of NFTs & other social media content.\nOwn your NFT'ss outright\nOpen-source community content governance System, using community-driven content policies and Content Portals. 
Each community regulates its own content.\nCompatible foundation for Layer II Honeycomb social media tokens, paving the way for smart contract and decentralised community tokens on Hive\nPermanent Service Infrastructure DeFi Pools (SIPs) which back communities with fees generated from their own DeFi pools and grow forever.\nPayments for digital services and products into an autonomous DeFi pool protocol.\nMajor sink for Hive & HBD with payments for LARYNX Miner Tokens and ad rights being locked into the SIP DeFi pool permanently\nAbility for DPos Chains to cap their governance tokens and sustain from SIP DeFi Fees instead of inflation.\nLiquid off-chain gas / bandwidth token (BROCA), which gives the option for an alternative to the Hive Resource Credits system if required.\nBROCA Gas token incentivises peers to provide improved content infrastructure services such as CDN, Encoding, and long-term storage\nHighly deflationary, SPK Governance token with long-term, diminishing inflation until supply cap is reached.\nSPK token Bond system where long-term power-ups are rewarded with additional inflation, rewards, and additional governance rights.\nSPK decentralised proposal fund with stake weighted voting and competitive price bidding for proposals.\nMeme-backed NFT Mining of rare, collectible content creator-issued NFTs.\nPerma web - content that the community/creator deems as sufficiently important will be funded such that it can be stored permanently on the SPK Network IPFS system.\nOpen-source ceramic off-chain accounts will allow for the following account features:\n\nUploading off-chain,\nCommenting off-chain\nAdjusting account settings / Syncing account data across multiple devices and frontends\nAttaching multiple blockchain accounts to the main account\n\nGeneral content Advertisement System.\nCommunities Advertisement System.\n\n3Speak Team Involvement\n\nThe 3Speak team has developed the following applications which will run on top of Hive and the SPK 
Network:\n\n3Speak Web Application\n3Speak Desktop Application\nDistributed IPFS storage\nFirst time Hive has the ability to self host content and guarantees upload of and ability to watch a video on Hive / IPFS ecosystem even if 3Speak web application is not available.\n\nWorking with Peerplays\n\nPeerplays Blockchain team has offered to provide their resources to help build the cross chain swaps system for SPK Network using an adaptation of their SONs technology.\n\nWorking with DLuX\n\nThe Team behind DLUX has been working layer 2 as long as anybody on Hive. They are the first team dedicated to open source, decentralized solutions for token architectures; Pioneering autonomous multi-authority control of HIVE funds to eliminate central control and single point failures. Their goals of application distribution align very nicely with our goals of video distribution creating a synergy that money alone can't buy. The level of industry knowledge and strategic input is first rate. We look forward to continuing to have our expectations met or exceeded.\n\nTeam\n\nScope of Milestones\n\nService Infrastrucure Nodes\n\nValidator nodes.\nCDN Nodes (ongoing development).\nStorage Nodes (ongoing development).\nLocal Encoding Nodes + optimisation of node cluster operations (ongoing development).\nCeramic Union Indexer - for combining off chain and onchain content feeds (ongoing development).\nSPK Network chain nodes (ongoing development).\n\nIPFS Storage System Development\n\nOngoing development of IPFS storage system and integration with Hive & SPK Network.\n\nOffchain Account Management\n\nGoal is to streamline sign up process and allow for further Hive scaling by posting content off chain.\n\nCeramic accounts integration.\nSign up & sign in with meta mask.\nBinding different blockchain accounts (ETH, POLY, BSC, BTC, HIVE) to your offchain ceramic account.\nCommenting & posting using off chain indexing system.\nSyncing comments/upvotes/playlists and other information to and 
from indexing ceramic nodes.\n\nBreakaway Communities / SPK Hubs\n\nGoal is to build stand-alone digital communities / Network States.\nIntegrate Ecency points system.\nToken drops using ecency points system.\nAnti-bot system.\nMultiple governance systems (DPoS, PoS, PoW, Fractal).\nPoB2 - long-term rewards of proof of brain mechanism.\nsystem to track which accounts are hosting content for the community and assign acalades.\n\n3Speak.tv\n\nDuring the development process, 3speak.tv will be refactored from the ground up to support the SPK network.\n\nDesktop App ongoing development\n\nUX (ongoing updates)\n\nIPFS/backend Side:\n\nDefault Gateway selection. Ability to change your primary IPFS gateway.\nRunning IPFS as a service/background in the app. Give users the option to disable or enable the background.\nAutomatically download videos from content creators you follow.\n\nVideo Uploading\n\nFFmpeg local encoding, Note this is done at present as an MVP but requires more extension/maturity.\nVideo timestamps. Similar to YouTube, creators can label certain sections of their video, effectively creating chapters.\nDebug Menu - continue development.\n\nDLux\n\nHoneycomb Social community Token System + SPK Network Tokenomics\n\nLarynx miner purchase mechanism.\nBroca incentive token release.\nDevelop Broca functionality (paying for storage, CDN, encoding, and other infrastructure)\nStake weighted voting system for setting Network variables.\nSPK DAO.\nMining Rewards distribution system.\nBreak away community honeycomb separate node spin-up system (to create separate, stand-alone node networks from SPK Network node operators & cut community token inflation to these node operators).\nIntegrate Token system into Desktop App.\nSPK Network Bond System.\nThe longer and more you Power Up, the higher your interest rate is. 
Additionally, the longer the Power-up is locked in, the more influential the governance vote becomes, rewarding long-term holders with proportionally more influence the longer they Power Up.\nContent creators / communities can create their own tokens, including token Staking.\nVoting system where voter can see what infrastructure each node operator is running and vote them into validator node top concenus psotion with SPK tokens.\n\nMining Mechanisms\n\nInterfacing / integrating Honeycomb with SPK Network mining mechanisms.\nStorage (Proof of Access) Mining system.\nCDN rewarding.\nEncoder Node rewarding.\nService Node rewarding.\n\nNFTs\n\nNFT Market place and bidding platform.\nNFT Storage on IPFS.\nNFT mining by Staking Creator Tokens.\nNFT Memes System.\n\nService Infrastructure Pool (SIP)\n\nDeFi Mechanisms0.\nPayments into SIP.\nPayments out to Project Funding Pool.\nPayments out to Support Infrastructure in times of low payouts.\nStaking.\nToken Wrapping.\nCommunity Liquidity pools & DeFi.\n\nProposal Bidding Platform\n\nhttps://speakbounties.herokuapp.com/\n\n(This is an MVP - not yet completed)\n\nSPK Network Proposal System\n\nTask Setting System.\nFunding mechanism from SPK DAO.\nBidding System.\nVoting System.\n\nPeer Plays\n\nSidechain Operating Nodes (SONs): a trustless, decentralized PoS cross-chain mechanism for running cross-chain swap mechanisms.\nSPK SONs.\nHive SONs.\nBuild Interface / desktop plug-in to Swap tokens from / to SPK to BTC / Solidity Chains (ETH, POLY, BSC) without user needing to login to peer plays.\nUse SONs tech to Wrap Solidity tokens (ETH, POLY, BSC), SPK Network tokens, and BTC to and from SPK Chain without user needing to login to peer plays.\n\nAPI System\n\nTo include video uploading and delivery initially but ultimately to allow any platform to easily integrate any web3 tool that is available on the SPK Network. 
(See technical overview for further details).\n\nMuting and Blocklists/content policy system\n\nAs described in Light Paper & technical overview.\n\nContent Gateways Portals\n\nAs described in Light Paper & technical overview.\n\nServers\n\nCost of Servers/experimental infrastructure. Funding will be used accordingly to operate development and testnet infrastructure.\n\nBudget\n\n390.09 USD per day for 365 days = 142,382.85 USD.\n\nInsurance Guarantee of Funds\n\nThe funds will be sent to the account @spkproposal and distributed to cover the costs for execution of the above scope from there. The SPK Network Proposal will have two trusted Hive Community members as the trustees to the funds received by the proposal. @starkerz & @theycallmedan both will take on responsibility for these funds and will ensure they are distributed to the SPK network developers, and will provide their guarantee that they will reimburse this proposal in the event of any funds lost.\n\nFinal notes:\n\nIf you have any comments, concerns, confusion, or parts of this proposal or attached documents, feel free to reach out to us. We will be happy to answer.\n\nAbout the SPK Network:\n\nSPK Network, a decentralised, censorship-resistant social media protocol and incentivization layer for web3. The SPK Network provides the ability for video platforms and content creators to interact with the decentralized social graph, while rewarding infrastructure providers with SPK governance and BROCA gas tokens.\n\nSupport this proposal:", "articleBodyHtml": "
\n\n
\n\n

Note: For additional details on any of the below, please refer to the SPK Network Light Paper.

\n\n

Note: The SPK Network is completely open-source, and no tokens will be given to the team working on it. All SPK token distribution will be done by the SPK Network protocol, which rewards peers in the Network for providing social media content infrastructure services, and via LARYNX miner token delegations.

\n\n
\n\n

What the System Aims to Achieve for DPoS Graphene Technology:

\n\n
  • SONs wrapping technology helps to make DPoS the natural go-to side chain for high-fee protocols such as ETH and BTC.
  • \n
  • BTC, ETH, Hive, PPY, SPK, BROCA, LARYNX, DLUX, and tips are all from the same blockchain account (more chains will be added as time goes on).
  • \n
  • Off-chain storage for reducing content/info storage bloat from Hive layer I.
  • \n
  • Censorship-resistant data with the ability to delete your own off-chain content.
  • \n
  • Incentivised Peer to Peer content infrastructure provision system including Content Delivery, Encoding, and Storage.
  • \n
  • Distributed, on-chain NFT (attached image) & normal image storage on IPFS
  • \n
  • User self-hosting of NFTs & other social media content.
  • \n
  • Own your NFTs outright
  • \n
  • Open-source community content governance System, using community-driven content policies and Content Portals. Each community regulates its own content.
  • \n
  • Compatible foundation for Layer II Honeycomb social media tokens, paving the way for smart contract and decentralised community tokens on Hive
  • \n
  • Permanent Service Infrastructure DeFi Pools (SIPs) which back communities with fees generated from their own DeFi pools and grow forever.
  • \n
  • Payments for digital services and products into an autonomous DeFi pool protocol.
  • \n
  • Major sink for Hive & HBD with payments for LARYNX Miner Tokens and ad rights being locked into the SIP DeFi pool permanently
  • \n
  • Ability for DPos Chains to cap their governance tokens and sustain from SIP DeFi Fees instead of inflation.
  • \n
  • Liquid off-chain gas / bandwidth token (BROCA), which gives the option for an alternative to the Hive Resource Credits system if required.
  • \n
  • BROCA Gas token incentivises peers to provide improved content infrastructure services such as CDN, Encoding, and long-term storage
  • \n
  • Highly deflationary, SPK Governance token with long-term, diminishing inflation until supply cap is reached.
  • \n
  • SPK token Bond system where long-term power-ups are rewarded with additional inflation, rewards, and additional governance rights.
  • \n
  • SPK decentralised proposal fund with stake weighted voting and competitive price bidding for proposals.
  • \n
  • Meme-backed NFT Mining of rare, collectible content creator-issued NFTs.
  • \n
  • Perma web - content that the community/creator deems as sufficiently important will be funded such that it can be stored permanently on the SPK Network IPFS system.
  • \n
  • Open-source ceramic off-chain accounts will allow for the following account features:\n
      \n
    • Uploading off-chain,
    • \n
    • Commenting off-chain
    • \n
    • Adjusting account settings / Syncing account data across multiple devices and frontends
    • \n
    • Attaching multiple blockchain accounts to the main account
    • \n
  • \n
  • General content Advertisement System.
  • \n
  • Communities Advertisement System.
\n\n

3Speak Team Involvement

\n\n

The 3Speak team has developed the following applications which will run on top of Hive and the SPK Network:

\n\n
  • 3Speak Web Application
  • \n
  • 3Speak Desktop Application
  • \n
  • Distributed IPFS storage
  • \n
  • For the first time, Hive has the ability to self-host content, guaranteeing that a video can be uploaded and watched on the Hive / IPFS ecosystem even if the 3Speak web application is unavailable.
\n\n

Working with Peerplays

\n\n

Peerplays Blockchain team has offered to provide their resources to help build the cross chain swaps system for SPK Network using an adaptation of their SONs technology.

\n\n

Working with DLuX

\n\n

The team behind DLUX has been working on layer 2 as long as anybody on Hive. They are the first team dedicated to open-source, decentralized solutions for token architectures, pioneering autonomous multi-authority control of HIVE funds to eliminate central control and single points of failure. Their goals of application distribution align very nicely with our goals of video distribution, creating a synergy that money alone can't buy. The level of industry knowledge and strategic input is first rate. We look forward to continuing to have our expectations met or exceeded.

\n\n

Team

\n\n

Scope of Milestones

\n\n

Service Infrastructure Nodes

\n\n
  • Validator nodes.
  • \n
  • CDN Nodes (ongoing development).
  • \n
  • Storage Nodes (ongoing development).
  • \n
  • Local Encoding Nodes + optimisation of node cluster operations (ongoing development).
  • \n
  • Ceramic Union Indexer - for combining off chain and onchain content feeds (ongoing development).
  • \n
  • SPK Network chain nodes (ongoing development).
\n\n

IPFS Storage System Development

\n\n

Ongoing development of IPFS storage system and integration with Hive & SPK Network.

\n\n

Offchain Account Management

\n\n

The goal is to streamline the sign-up process and allow for further Hive scaling by posting content off chain.

\n\n
  • Ceramic accounts integration.
  • \n
  • Sign up & sign in with MetaMask.
  • \n
  • Binding different blockchain accounts (ETH, POLY, BSC, BTC, HIVE) to your offchain ceramic account.
  • \n
  • Commenting & posting using off chain indexing system.
  • \n
  • Syncing comments/upvotes/playlists and other information to and from indexing ceramic nodes.
\n\n

Breakaway Communities / SPK Hubs

\n\n
  • The goal is to build stand-alone digital communities / Network States.
  • \n
  • Integrate Ecency points system.
  • \n
  • Token drops using ecency points system.
  • \n
  • Anti-bot system.
  • \n
  • Multiple governance systems (DPoS, PoS, PoW, Fractal).
  • \n
  • PoB2 - long-term rewards of proof of brain mechanism.
  • \n
  • System to track which accounts are hosting content for the community and assign accolades.
\n\n

3Speak.tv

\n\n

During the development process, 3speak.tv will be refactored from the ground up to support the SPK network.

\n\n

Desktop App ongoing development

\n\n
UX (ongoing updates)
\n\n
IPFS/backend Side:
\n\n
  • Default Gateway selection. Ability to change your primary IPFS gateway.
  • \n
  • Running IPFS as a service/background in the app. Give users the option to disable or enable the background.
  • \n
  • Automatically download videos from content creators you follow.
\n\n
Video Uploading
\n\n
  • FFmpeg local encoding. Note: this exists at present as an MVP but requires more extension and maturity.
  • \n
  • Video timestamps. Similar to YouTube, creators can label certain sections of their video, effectively creating chapters.
  • \n
  • Debug Menu - continue development.
\n\n

DLux

\n\n
Honeycomb Social community Token System + SPK Network Tokenomics
\n\n
  • Larynx miner purchase mechanism.
  • \n
  • Broca incentive token release.
  • \n
  • Develop Broca functionality (paying for storage, CDN, encoding, and other infrastructure)
  • \n
  • Stake weighted voting system for setting Network variables.
  • \n
  • SPK DAO.
  • \n
  • Mining Rewards distribution system.
  • \n
  • Break away community honeycomb separate node spin-up system (to create separate, stand-alone node networks from SPK Network node operators & cut community token inflation to these node operators).
  • \n
  • Integrate Token system into Desktop App.
  • \n
  • SPK Network Bond System.
  • \n
  • The longer and more you Power Up, the higher your interest rate. Additionally, the longer the Power-up is locked in, the more influential the governance vote becomes, rewarding long-term holders with proportionally more influence.
  • \n
  • Content creators / communities can create their own tokens, including token Staking.
  • \n
  • Voting system where voters can see what infrastructure each node operator is running and vote them into top validator-node consensus positions with SPK tokens.
\n\n
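The bond mechanics in the list above (longer locks earn more interest and carry more governance weight) can be sketched as two monotone functions of lock duration. The linear rates below are invented placeholders, not SPK's actual schedule.

```python
# Illustrative bond model: interest rate and governance weight both
# grow with lock duration. BASE_RATE and BONUS_PER_WEEK are assumed
# values chosen only to demonstrate the shape of the incentive.

BASE_RATE = 0.02        # assumed base APR for an unlocked stake
BONUS_PER_WEEK = 0.001  # assumed APR bonus per week of lock

def bond_rate(weeks_locked: int) -> float:
    """Interest rate rises linearly with lock duration."""
    return BASE_RATE + BONUS_PER_WEEK * weeks_locked

def governance_weight(amount: float, weeks_locked: int) -> float:
    """A one-year lock counts double versus an unlocked stake."""
    return amount * (1 + weeks_locked / 52)

print(round(bond_rate(52), 3))       # rate after a one-year lock
print(governance_weight(100.0, 52))  # vote weight of 100 tokens locked a year
```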
Mining Mechanisms
\n\n
  • Interfacing / integrating Honeycomb with SPK Network mining mechanisms.
  • \n
  • Storage (Proof of Access) Mining system.
  • \n
  • CDN rewarding.
  • \n
  • Encoder Node rewarding.
  • \n
  • Service Node rewarding.
\n\n
NFTs
\n\n
  • NFT Market place and bidding platform.
  • \n
  • NFT Storage on IPFS.
  • \n
  • NFT mining by Staking Creator Tokens.
  • \n
  • NFT Memes System.
\n\n
Service Infrastructure Pool (SIP)
\n\n
  • DeFi Mechanisms.
  • \n
  • Payments into SIP.
  • \n
  • Payments out to Project Funding Pool.
  • \n
  • Payments out to Support Infrastructure in times of low payouts.
  • \n
  • Staking.
  • \n
  • Token Wrapping.
  • \n
  • Community Liquidity pools & DeFi.
\n\n
Proposal Bidding Platform
\n\n

https://speakbounties.herokuapp.com/

\n\n

(This is an MVP - not yet completed)

\n\n
SPK Network Proposal System
\n\n
  • Task Setting System.
  • \n
  • Funding mechanism from SPK DAO.
  • \n
  • Bidding System.
  • \n
  • Voting System.
\n\n

Peerplays

\n\n
  • Sidechain Operating Nodes (SONs): a trustless, decentralized PoS cross-chain mechanism for running cross-chain swap mechanisms.
  • \n
  • SPK SONs.
  • \n
  • Hive SONs.
  • \n
  • Build an interface / desktop plug-in to swap tokens between SPK and BTC / Solidity chains (ETH, POLY, BSC) without the user needing to log in to Peerplays.
  • \n
  • Use SONs tech to wrap Solidity tokens (ETH, POLY, BSC), SPK Network tokens, and BTC to and from the SPK Chain without the user needing to log in to Peerplays.
\n\n

API System

\n\n

To include video uploading and delivery initially but ultimately to allow any platform to easily integrate any web3 tool that is available on the SPK Network. (See technical overview for further details).

\n\n

Muting and Blocklists/content policy system

\n\n

As described in Light Paper & technical overview.

\n\n

Content Gateways Portals

\n\n

As described in Light Paper & technical overview.

\n\n

Servers

\n\n

Cost of Servers/experimental infrastructure. Funding will be used accordingly to operate development and testnet infrastructure.

\n\n

Budget

\n\n

390.09 USD per day for 365 days = 142,382.85 USD.

\n\n
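The budget arithmetic above checks out directly:

```python
# Verify the stated budget: 390.09 USD/day over 365 days.
daily = 390.09
total = round(daily * 365, 2)
print(total)  # 142382.85, matching the figure in the proposal
```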

Insurance Guarantee of Funds

\n\n

The funds will be sent to the account @spkproposal and distributed from there to cover the costs of executing the above scope. The SPK Network Proposal will have two trusted Hive community members as trustees of the funds received by the proposal. @starkerz & @theycallmedan will both take responsibility for these funds, will ensure they are distributed to the SPK Network developers, and guarantee that they will reimburse this proposal in the event of any funds lost.

\n\n

Final notes:

If you have any comments, concerns, or questions about any part of this proposal or the attached documents, feel free to reach out to us. We will be happy to answer.

About the SPK Network:

The SPK Network is a decentralised, censorship-resistant social media protocol and incentivization layer for web3. It enables video platforms and content creators to interact with the decentralized social graph while rewarding infrastructure providers with SPK governance and BROCA gas tokens.


Support this proposal:

", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/spk-network-funding-proposal-rhnv7e"},{"url": "https://hive.blog/hive-112019/@spknetwork/spk-network-tokens-logo-contest", "probability": 0.9275982, "headline": "SPK Network Tokens Logo Contest | 3,000 HBD in Prizes!", "datePublished": "2022-05-24T04:26:14.782196", "datePublishedRaw": "11 months ago", "author": "lisbethseijas", "authorsList": ["lisbethseijas"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23u5zy3kXPLYCmr8zKLxcMmwiY2wzTw8AcqJfGaUNHeCyK9NcKYnmKg4uMGofn644HFme.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23u5zy3kXPLYCmr8zKLxcMmwiY2wzTw8AcqJfGaUNHeCyK9NcKYnmKg4uMGofn644HFme.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/AJoKXcaSz2CGFrELeWgkhgg6NF6b2p45Rzy1BVcRrLKVBwJjhBD2CaK8xeEKzBX.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/aichanbot/242honUYquZ5C3snsk1321FZdztiBp256RUMGxqxSagjcgehzFrewPq5ZHCpEaEusbKEx.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/lisbethseijas/23tRtHGTphoyo3hGksngN2QAJoCy3M97Xp3LWMVrZWkqUohBUAN3tYoZHmfFiaJcK6YQY.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/aichanbot/23tmUFe89HqfPYLBzsypkQzk2LatmERTzY1rSbm5JEBWV1YsM3N6BCoxE9APh6YDLPX4X.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/lawrence27/AK7yKc2QoTqJLELN2jj8JW1Chh4SjiYQsz8wJLX2ku5fhr9Q6SjoQsFMvtX5F8A.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/robert0z/243zvZycR1kEUcJ4QhjqzeMF7x38LTqvMD5p9FpndCWQSijure4EaJrh4TGDDN7U1Hj1W.jpg", 
"https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/robert0z/23xViWnuJK3Z4higag75wmJn8z8SHGxPUP7phnJPY2HPqf31oFCBuwBnnE2UjKJwH6RUU.jpg", "https://images.hive.blog/DQmY6NiSKmyAqjXczuSLz5mWrmKXoA2uZHrr9W72WWDAcCT/spk%20logo%20redondo%20listo.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/robert0z/23yU2L4DcEpa1hmQgvMcQr5yKpUGVmqTaGQdeLQnaRm68TTpT157TBpKhsa1E1RjoTZee.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/charsdesign/23tbCjEuA2uEvPnfQbCsqNPRYPaWWbGme3CKni67ZEZrv9CYdcRMT39dv2DWgAX9zjVhH.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/robert0z/23yxAY43fkGS4YM4ugBK3ZF1SRYwZsM6gSVaBJKXVidHANHR8yNVLqsr641WXyBJjoJYC.jpg", "https://images.hive.blog/0x0/https://files.peakd.com/file/peakd-hive/camiloferrua/23vPduNYBKBYUbT1ngYZWW73vWZ2vce7s5RBAhqYjC9YDRPvcGaAVw8xEyY6hpeWykrFH.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/cetb2008/23tkhyRmCBLzxDRkmCufUq6xn6Bs4bqjXPRSaXJWtemwG9vzTqdU6TkiE94KLGwxM9UfU.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/robert0z/241toxTyuwjPU1uRkLo4pL233RrkDNQSnKheg9kuF1WMyQCLdMELMcJnbxxF6h6rrCk43.jpg", "https://images.hive.blog/768x0/https://images.ecency.com/DQmVrbVhMfTpDCx2QsFkNptryzUmJABi6b3kVqX4zNsxB3d/spk_laxrim.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/cetb2008/23uEjFNREGAfX3m2KRCqbXax5z1krPsA9zFNRjMJ2pqtcUVF57N4qpZXCsQWaUGkhE4Ph.png", "https://images.hive.blog/768x0/https://images.ecency.com/DQmQc4HfMUf3Q3i93BcL73tmTp4jeVN8vYMCkGkgWFgRhR1/rect199.png", "https://images.hive.blog/DQmXjbaVQjZkAzW3nZT18GSu6KmM2hSVzMADBrkFLjLvYTE/broca%20logo.png", "https://images.hive.blog/0x0/https://files.peakd.com/file/peakd-hive/camiloferrua/23wgoDUM72UnUEFWqZDhM9FZ6rdQwie3D9627RDv2uqUrwNcrVvVcbWGskwGAd8tN7XZF.png", "https://images.hive.blog/DQmdBNXrmxTApBNaX3zC7p4svPn1jsQBMiayKHa3Fa9LrzM/larynx%20logo%20redondo.png", 
"https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/elliptique/23z7WxoQEz54etP45MfUmyFGLqM2ENiVSJvFRGxJmqgNa9ukefbUoKjzGgZco7UQ3Z9pr.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/hardaeborla/AL4YqEEzdwDZzovP1m2e6XEWsxYgVKDoQsLM4Wd2c7nZywTcqcrnBhzLCmcUcvT.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/charsdesign/23tvY8fkGYu5qJL4ywCjmiEWx6Ji5QZTov6A1RhSEPZQXcMZPPGLYW1aSyfALHzEzdXR1.jpg", "https://images.hive.blog/768x0/https://images.ecency.com/DQmPLaBSNAZboKeGWWcVHes2xiboGDCUDhRHGPwpYZk1VnZ/speak_network.jpg", "https://images.hive.blog/DQmX3Ex625habJAoQTx3vHorRHW42Zd6Ks3tG5JKarPPqE1/IMG-20220411-WA0008.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/hardaeborla/AKGe1dqrdN88yqcHMn5ncaTJb7JfcRaLwzvzdh4CfX6ZC7od3HLnCNsnGJR8PWQ.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/cetb2008/23tkowUirVtpbBNPa3bJeoqFHX83cqt4vJBV8ySmaANzmG2eFS3GKNkggYpdEPWNm8dXd.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/hardaeborla/241tsznJwfjZYaXoJfDhdEcJ3diwwcCXsRAGzYXZoz4teEMV3ke5iJSWaUtJ3ZKbqV62c.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/zottone444/23t7AyKqAfdxKEJPQrpePMW15BCPhbyrf5VoHWxhBFcEcPLjDUVVQAh9ZAopbmoJDekS6.png"], "description": "Hello Hivers! We are happy to announce the SPK Network Tokens Logos Contest . In this post, you will find the details on how you can participate in the contest and the\u2026 by spknetwork", "articleBody": "Hello Hivers!\n\nWe are happy to announce the SPK Network Tokens Logos Contest. In this post, you will find the details on how you can participate in the contest and the prizes that we will give to the winners. This is a community project, and we want you to be part of it.\n\nSPK Network has three main tokens:\n\nLARYNX Miner Tokens\n\nLARYNX Miner tokens should be thought as, as physical miner rigs, but in a digital form. 
The only way SPK Governance tokens can be earned is by staking LARYNX Miner tokens and running SPK Network Peer to Peer infrastructure nodes. When staked, LARYNX Miner Tokens are locked permanently so as to identify legitimate infrastructure miners, willing to stake value into the network. The more LARYNX burned for mining, the more profitable/efficient the mining becomes.\n\nBROCA\n\nBROCA is the Network's gas token to limit spam. It is consumed when the user uploads content to the Network and automatically regenerates each day if the user powers up their liquid SPK tokens.\n\nSPK\n\nSPK token is the capped governance token of the Network. An SPK token holder is able to influence the governance of the Network with their voting weight, based on how much of the SPK token they have Powered up. In order to vote a user must have powered up their SPK for at least 30 days. This gives the Network time to protect itself in case of a Sybil-type attack.\n\nImportant: This logo contest is for SPK the token and NOT for the SPK Network.\n\nSPK NETWORK LIGHT PAPER.\n\nWe will choose the best logos of each token (LARYNX, BROCA, and SPK).\n\nThere can be only one winner or up to three different winners.\n\nEach logo winner will be rewarded with 1,000 HBD.\n\nIf you make the three winning logos you can win up to 3,000 HBD!", "articleBodyHtml": "

Hello Hivers!

We are happy to announce the SPK Network Tokens Logos Contest. In this post, you will find the details on how you can participate in the contest and the prizes that we will give to the winners. This is a community project, and we want you to be part of it.

SPK Network has three main tokens:

  • LARYNX Miner Tokens

LARYNX Miner tokens should be thought of as physical mining rigs in digital form. The only way SPK Governance tokens can be earned is by staking LARYNX Miner tokens and running SPK Network peer-to-peer infrastructure nodes. When staked, LARYNX Miner tokens are locked permanently so as to identify legitimate infrastructure miners willing to stake value into the network. The more LARYNX is burned for mining, the more profitable and efficient the mining becomes.

  • BROCA

BROCA is the Network's gas token to limit spam. It is consumed when the user uploads content to the Network and automatically regenerates each day if the user powers up their liquid SPK tokens.

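The consume-and-regenerate behaviour can be modelled roughly as below. This is a sketch under stated assumptions: the gas cap per powered SPK and the daily regeneration rate are illustrative numbers, not the network's actual parameters.

```python
# Hypothetical sketch of a BROCA-style gas account. The cap
# (BROCA_PER_SPK) and DAILY_REGEN_RATE are illustrative assumptions.
from dataclasses import dataclass
from typing import ClassVar

@dataclass
class BrocaAccount:
    powered_spk: float            # SPK the user has powered up
    broca: float = 0.0            # current gas balance

    BROCA_PER_SPK: ClassVar[float] = 1000.0   # assumed gas cap per powered SPK
    DAILY_REGEN_RATE: ClassVar[float] = 0.2   # assumed fraction of cap restored daily

    @property
    def cap(self) -> float:
        # No powered SPK means a zero cap, and therefore no regeneration.
        return self.powered_spk * self.BROCA_PER_SPK

    def regenerate_daily(self) -> None:
        # Gas refills toward the cap once per day.
        self.broca = min(self.cap, self.broca + self.cap * self.DAILY_REGEN_RATE)

    def upload(self, cost: float) -> bool:
        # Uploading content consumes gas; reject when the balance is short.
        if cost > self.broca:
            return False
        self.broca -= cost
        return True
```

With these assumed numbers, an account with 1 powered SPK regains 200 BROCA per day and can spend it on uploads until the balance runs out.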
  • SPK

SPK is the capped governance token of the Network. An SPK token holder is able to influence the governance of the Network with their voting weight, based on how much SPK they have powered up. In order to vote, a user must have powered up their SPK for at least 30 days. This gives the Network time to protect itself in case of a Sybil-type attack.

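The 30-day rule can be sketched as a simple eligibility check. Assuming a single power-up date per account for illustration (the real chain tracks power-ups in finer detail):

```python
# Minimal sketch of the 30-day voting-eligibility rule. The single
# power-up date per account is a simplifying assumption.
from datetime import date, timedelta

def voting_weight(powered_spk: float, power_up_date: date, today: date) -> float:
    """Return the stake that counts toward governance voting.

    SPK powered up for less than 30 days carries no voting weight, giving
    the network time to protect itself against a Sybil-style attack.
    """
    if today - power_up_date < timedelta(days=30):
        return 0.0
    return powered_spk
```

Freshly powered-up stake therefore cannot swing a vote on day one; it only starts counting a month later.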

Important: This logo contest is for SPK the token and NOT for the SPK Network.

SPK NETWORK LIGHT PAPER.

  • We will choose the best logo for each token (LARYNX, BROCA, and SPK).
  • There can be a single winner or up to three different winners.
  • Each winning logo will be rewarded with 1,000 HBD.
  • If you make all three winning logos, you can win up to 3,000 HBD!

", "canonicalUrl": "https://peakd.com/hive-112019/@spknetwork/spk-network-tokens-logo-contest"},{"url": "https://hive.blog/witness-category/@roelandp/witness-roelandp", "probability": 0.83715403, "headline": "Witness @roelandp", "datePublished": "2016-04-24T04:26:22.543446", "datePublishedRaw": "7 years ago", "author": "roelandp", "authorsList": ["roelandp"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://s16.postimg.org/s8vdjui5x/roelandpforwitness.jpg", "description": "The time has come to rebrand my early witness test @paynode (#49) and reveal my intention as witness @roelandp and upgrade all my efforts for this. I want to seriously add my\u2026 by roelandp", "articleBody": "The time has come to rebrand my early witness test @paynode (#49) and reveal my intention as witness @roelandp and upgrade all my efforts for this. I want to seriously add my fair share of server support to this blockchain and have prepared for this in the past weeks:\n\nI've run @paynode for about 7/8 weeks without any downtime and 2 blocks missed because of initial misconfiguration 1 block missed and 2 blocks missed because of a replay of the blockchain after a hardfork update. I know this shouldn't happen but I was a backup witness without a backup node... Those things have changed now.\n\nIf you want to learn more about me, I recommend you read my introduceyourself post. It deals about kitesurfing, windmills, pancakes, festivals and how I enrolled in computer life.\n\nThrough my company, Shoudio (a badly chosen \"Shout-Audio\" acronym), CMS and app production for location based media and numerous other web apps I've gained considerable knowledge in the fields of server maintenance and cloud based intense usage \"solutions\". At home I have some Raspberry Pi's with Arch Linux, Raspian, Kodi XBMC. Here is a RaspiCam to Youtube stream tutorial for you.\n\nMy @roelandp witness infrastructure is ready to down scale :) when needed. 
All my servers are firewalled and properly protected following best practices in server security.\n\nType Dedicated\nProcessor Intel i7-6700\nRam 64 GB DDR4\nHDD 2 x 250 SSD\nConnection 1 Gbit/s\n\nSeed node (seed.roelandp.nl:2001):\n\nType Dedicated\nProcessor Intel Xeon D-1521\nRam 32 GB DDR4\nHDD 2 x 2 TB\nConnection 1 Gbit/s\nLocation Canada\n\nBackup Witness node:\n\nType VPS\nProcessor x86 64bit (8x)\nRam 16 GB DDR4\nHDD 50 GB SSD\nConnection 500 Mbit/s\n\nWhat I have brought to Steem to date:\n\nWhat I want to bring for Steem in the future:", "articleBodyHtml": "

The time has come to rebrand my early witness test @paynode (#49), reveal my intention as witness @roelandp, and upgrade all my efforts for this. I want to seriously add my fair share of server support to this blockchain and have prepared for this in the past weeks:

I've run @paynode for about 7-8 weeks without any downtime: 1 block missed because of initial misconfiguration and 2 blocks missed because of a replay of the blockchain after a hardfork update. I know this shouldn't happen, but I was a backup witness without a backup node... Those things have changed now.

If you want to learn more about me, I recommend you read my introduceyourself post. It deals with kitesurfing, windmills, pancakes, festivals, and how I enrolled in computer life.


Through my company, Shoudio (a badly chosen "Shout-Audio" acronym), its CMS and app production for location-based media, and numerous other web apps, I've gained considerable knowledge in server maintenance and cloud-based heavy-usage "solutions". At home I have some Raspberry Pis running Arch Linux, Raspbian, and Kodi (XBMC). Here is a RaspiCam-to-YouTube stream tutorial for you.

My @roelandp witness infrastructure is ready to scale down :) when needed. All my servers are firewalled and properly protected following best practices in server security.

Witness node:

Type: Dedicated
Processor: Intel i7-6700
RAM: 64 GB DDR4
HDD: 2 x 250 GB SSD
Connection: 1 Gbit/s

Seed node (seed.roelandp.nl:2001):

Type: Dedicated
Processor: Intel Xeon D-1521
RAM: 32 GB DDR4
HDD: 2 x 2 TB
Connection: 1 Gbit/s
Location: Canada

Backup witness node:

Type: VPS
Processor: x86 64-bit (8 cores)
RAM: 16 GB DDR4
HDD: 50 GB SSD
Connection: 500 Mbit/s


What I have brought to Steem to date:

What I want to bring for Steem in the future:

", "canonicalUrl": "https://hive.blog/witness-category/@roelandp/witness-roelandp"},{"url": "https://hive.blog/hive-112019/@spknetwork/bnsdtkbq", "probability": 0.83652955, "headline": "SPK Network's Governance Token Testnet", "datePublished": "2022-07-24T04:26:22.913864", "datePublishedRaw": "9 months ago", "author": "@eddiespino", "authorsList": ["@eddiespino"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/spknetwork/23w2kqSTmh4DpNXH6js2dzb5mJqQkYBbLuqCQuPgsEw144XsTzWaJ6r81Td6ZQ7j7bPdt.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tRvWxYWWWaiq6jTUWfNgNKPbFd1X9GdLwBmj1VC84TzKwL9dcyB8L5trWDiuKERkkQW.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23tbKBM5r7AbYPhtSVuWwGH2eZP9rBNrQ5u97Qa3w6nn9EnHM82x5qvHr3cqJwggrbReb.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23w2kqSTmh4DpNXH6js2dzb5mJqQkYBbLuqCQuPgsEw144XsTzWaJ6r81Td6ZQ7j7bPdt.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/EoyPvPJ85q7QZcxhjoPDatQ122vdsiFfk8WvJCrBb3gEsQFPqhSD5Nvzn4iobaaZD6q.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/disregardfiat/23vrtseCDnY9diY8TaqjxQP1X6mTeNTRM3r3fLq2VypUzCj6kcLhMQkv1NjjHgzXhTnJf.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/zottone444/23t7AyKqAfdxKEJPQrpePMW15BCPhbyrf5VoHWxhBFcEcPLjDUVVQAh9ZAopbmoJDekS6.png", "https://3speak.tv/embed?v=spknetwork/bnsdtkbq"], "description": "\u25b6\ufe0f Watch on 3Speak The SPK Claim Chain is becoming the SPK network. 
These changes are happening slowly to ensure the community owns this network from the\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak\n\nThe SPK Claim Chain is becoming the SPK network. These changes are happening slowly to ensure the community owns this network from the beginning. Let's review.\n\nThere are three token types in our proposed ecosystem. LARYNX, SPK, and BROCA. LARYNX is the miner token with an ongoing airdrop that can be claimed monthly. SPK is the governance token that will help turn our HIVE \"side-chain\" or \"layer 2\" into a DPoS (Delegated Proof-of-Stake) layer 2. This current change being tested allows our Larynx miner stakers to earn a very small amount of SPK to build a seedling governance. At our proposed distribution rates of 0.1, 0.015(x2), and 0.01%, we will expect no more than 15,000 SPK tokens to be minted by the next release; after which these tokens will vote a top 20(and even the number 20) which will then vote for the networks validators and other key network variables.\n\nA few features have been added for this release so let's cover them one by one. First, log in at dlux.io's wallet\n\nYou might have a claim available (if so, actually claim it at dlux.io/dex), but you'll need to claim it on the test net to play with these TEST tokens.\n\nThe TESTLARYNX menu will allow you to power up your mining tokens. If you run a SPKCC node, the Lock Liquidity menu will also be available, earning the highest return rate.\n\nNow that you've powered up your LARYNX tokens, you're earning the base staking reward of 0.01% APR on your powered tokens. You can earn 50% more by delegating to a network provider account. You'll both earn 0.015% on the delegated amount. In the future, we hope this encourages network providers to provide better infrastructure and services without having to finance virtual miners and provide real services.\n\nFinally. 
You can see and change your delegations by clicking the magnifying glass:\n\nWe don't expect this test period to last very long. Soon we'll be earning SPK tokens! Please leave a comment with your feedback.", "articleBodyHtml": "

▶️ Watch on 3Speak


The SPK Claim Chain is becoming the SPK Network. These changes are happening slowly to ensure the community owns this network from the beginning. Let's review.

There are three token types in our proposed ecosystem: LARYNX, SPK, and BROCA. LARYNX is the miner token, with an ongoing airdrop that can be claimed monthly. SPK is the governance token that will help turn our Hive "side-chain" or "layer 2" into a DPoS (Delegated Proof-of-Stake) layer 2. The change currently being tested allows our LARYNX miner stakers to earn a very small amount of SPK to build a seedling governance. At our proposed distribution rates of 0.1%, 0.015% (x2), and 0.01%, we expect no more than 15,000 SPK tokens to be minted by the next release; after that, these tokens will vote in a top 20 (including the number-20 slot), which will then vote for the network's validators and other key network variables.


A few features have been added for this release, so let's cover them one by one. First, log in at dlux.io's wallet.

You might have a claim available (if so, actually claim it at dlux.io/dex), but you'll need to claim it on the testnet to play with these TEST tokens.

The TESTLARYNX menu will allow you to power up your mining tokens. If you run an SPKCC node, the Lock Liquidity menu will also be available, earning the highest return rate.


Now that you've powered up your LARYNX tokens, you're earning the base staking reward of 0.01% APR on your powered tokens. You can earn 50% more by delegating to a network provider account: you'll both earn 0.015% on the delegated amount. In the future, we hope this encourages network providers to deliver better infrastructure and real services rather than just financing virtual miners.

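As a worked example of those rates, here is a small sketch. It assumes the delegated portion earns 0.015% APR instead of the 0.01% base rate (our reading of "50% more"); the function name and shape are illustrative.

```python
# Worked example of the quoted staking rates: 0.01% base APR on
# undelegated stake, 0.015% APR on the delegated amount. The assumption
# that delegated stake replaces (rather than adds to) the base rate is ours.
def larynx_rewards(powered: float, delegated: float) -> dict:
    """Annual LARYNX staking rewards for a staker who delegates some stake."""
    base = (powered - delegated) * 0.0001    # 0.01% APR on undelegated stake
    delegation = delegated * 0.00015         # 0.015% APR on the delegated amount
    return {"base": base, "delegation": delegation, "total": base + delegation}
```

For example, 10,000 powered LARYNX with 4,000 delegated would earn 0.6 on the undelegated 6,000 plus 0.6 on the delegated 4,000 per year under these assumptions, and the provider would earn the same 0.015% on its side.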
\"image.png\"
\n\n

Finally. You can see and change your delegations by clicking the magnifying glass:

\n\n
\"image.png\"
\n\n

We don't expect this test period to last very long. Soon we'll be earning SPK tokens! Please leave a comment with your feedback.

", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/bnsdtkbq"},{"url": "https://hive.blog/hive-181335/@threespeak/invitation-to-become-a-3speak-encoder-node-operator", "probability": 0.90053105, "headline": "Invitation to Become a 3Speak Encoder Node Operator", "datePublished": "2022-04-24T04:26:24.556844", "datePublishedRaw": "last year", "author": "threespeak", "authorsList": ["threespeak"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/threespeak/23u5xipEadDSj6Dqj42Eyj6r8yNXJMQhGXJpxCtPMoQ4QRikX21fQb3N3J7qUSZeEsRep.png", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/threespeak/23u5xipEadDSj6Dqj42Eyj6r8yNXJMQhGXJpxCtPMoQ4QRikX21fQb3N3J7qUSZeEsRep.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/threespeak/23tSwyU2FpH4kyQK7r9jAXWmogg5rjcuJbHferxguvm3gA4M71gqbrbuVyNcWgy7CE31a.png"], "description": "Hello Hivers! A few months ago, we made a post inviting 3Speak users to become beta testers . Today we are taking the next step. We are inviting you to become a 3Speak\u2026 by threespeak", "articleBody": "Hello Hivers!\n\nA few months ago, we made a post inviting 3Speak users to become beta testers. Today we are taking the next step. We are inviting you to become a 3Speak Encoder Node Operator.", "articleBodyHtml": "

Hello Hivers!

A few months ago, we made a post inviting 3Speak users to become beta testers. Today we are taking the next step. We are inviting you to become a 3Speak Encoder Node Operator.

", "canonicalUrl": "https://peakd.com/hive-181335/@threespeak/invitation-to-become-a-3speak-encoder-node-operator"},{"url": "https://hive.blog/witness-category/@smooth.witness/smooth-witness", "probability": 0.94455934, "headline": "smooth witness", "datePublished": "2016-04-24T04:26:44.111165", "datePublishedRaw": "7 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/", "description": "I\u2019m \u2018smooth\u2019, well known and active in the cryptocurrency community since 2011, core team member of Monero and lead developer of AEON. I have not been involved with Bitshares\u2026 by smooth.witness", "articleBody": "I\u2019m \u2018smooth\u2019, well known and active in the cryptocurrency community since 2011, core team member of Monero and lead developer of AEON.\n\nI have not been involved with Bitshares or other DPoS so operating a witness node is new to me. However, I have years of experience with development and deployment of high availability mission critical infrastructure, HPC big data, along with various cryptocurrency nodes, mining operations, and services. I\u2019ve also been mining and operating steem nodes since the launch in March 2016.\n\nI am one of the largest holders of STEEM independent of the original team, and my objective in operating a witness node includes working to ensure that the network is secure, reliable and scalable, in order to protect and grow the value of my stake. Although my identity is not public, I do not and have not ever operated any sock puppets or other misleading identities, and my five-year history in the community provides ample objective support for such a statement. Further, I state unequivocally that I have no affiliation with the steemit team, any of the steem developers, or any of the other witness operators outside of our normal online interactions. 
Thus I can promise that my witness node will be operated in fully-independent manner, faithful to the network rules and the best interests of my own stake and that of the other stakeholders.\n\nTo that end I have provisioned redundant enterprise-class hardware in a low-latency Tier 2 datacenter at an undisclosed location. This is dedicated hardware, not deployed on AWS or another cloud. There is more than ample excess CPU, memory, and storage to support rapid network growth, and all can be easily scaled as needed. Standby hardware is already online for fail-over, and backup witness nodes at additional locations will be added. In addition I will be providing a full-time seed node, currently located in AWS Singapore (IP below). These are fully updated with latest patches from github.", "articleBodyHtml": "

I'm 'smooth', well known and active in the cryptocurrency community since 2011, a core team member of Monero, and lead developer of AEON.

I have not been involved with Bitshares or other DPoS chains, so operating a witness node is new to me. However, I have years of experience with the development and deployment of high-availability, mission-critical infrastructure and HPC big data, along with various cryptocurrency nodes, mining operations, and services. I've also been mining and operating Steem nodes since the launch in March 2016.

I am one of the largest holders of STEEM independent of the original team, and my objective in operating a witness node includes working to ensure that the network is secure, reliable, and scalable, in order to protect and grow the value of my stake. Although my identity is not public, I do not and have not ever operated any sock puppets or other misleading identities, and my five-year history in the community provides ample objective support for that statement. Further, I state unequivocally that I have no affiliation with the Steemit team, any of the Steem developers, or any of the other witness operators outside of our normal online interactions. Thus I can promise that my witness node will be operated in a fully independent manner, faithful to the network rules and the best interests of my own stake and that of the other stakeholders.

To that end I have provisioned redundant enterprise-class hardware in a low-latency Tier 2 datacenter at an undisclosed location. This is dedicated hardware, not deployed on AWS or another cloud. There is more than ample excess CPU, memory, and storage to support rapid network growth, and all of it can be easily scaled as needed. Standby hardware is already online for fail-over, and backup witness nodes at additional locations will be added. In addition I will be providing a full-time seed node, currently located in AWS Singapore (IP below). These are fully updated with the latest patches from GitHub.

", "canonicalUrl": "https://hive.blog/witness-category/@smooth.witness/smooth-witness"},{"url": "https://hive.blog/hive-174578/@quochuy/my-personal-and-witness-introduction-to-the-hive-community", "probability": 0.9472063, "headline": "My personal and Witness introduction to the Hive community", "datePublished": "2020-04-24T04:27:02.500234", "datePublishedRaw": "3 years ago", "author": "quochuy", "authorsList": ["quochuy"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmXb9ndJS4otUJPezmMYff6qeJzjzyqReXwk6cyxFhCJbn/2235_61977544739_102_n.jpg", "images": ["https://images.hive.blog/DQmT3nhTk8oTquGdhxANKjtTcv4FsQbBUriS7vmvDUcBXYW/13668821_10154398676664740_7265654488623454319_o.jpg", "https://images.hive.blog/640x0/https://images.hive.blog/DQmecYTZemY8SwU6rrg7awhpNRBsehUp7ivk9RGyZsUUpVw/quochuy-hive-witness.jpg", "https://images.hive.blog/DQmXb9ndJS4otUJPezmMYff6qeJzjzyqReXwk6cyxFhCJbn/2235_61977544739_102_n.jpg"], "description": "My name Lets start with my name as this is subject to much confusion \ud83d\ude02. My Vietnamese first name is Qu\u1ed1c Huy which means something like the emblem of the country. For short\u2026 by quochuy", "articleBody": "My name\n\nLets start with my name as this is subject to much confusion \ud83d\ude02. My Vietnamese first name is Qu\u1ed1c Huy which means something like the emblem of the country. For short, it's just Huy and the closest easy pronunciation would be the english name Huey. See the following Youtube video: How to pronounce Huy)\n\nMy journey\n\nSo I'm Vietnamese, born in Lao but I spent most of my childhood and teenage time in New Caledonia which is a French colony (oversea territory). So I speak Vietnamese, French and English (from middle/high school).\n\nAfter high school, I went to Montpellier in the South of France by the Mediterranean sea to study computer sciences and I spent 10 years of my life in that beautiful city. 
After graduating at Uni, I spent some times working for some french companies and then made my way to London where my career as a website developer was boosted and I ended up working couple of years for the Financial Times. In London, I married my wife and we had our son. No long after, we both decided to move to Australia where we believe life would be better for raising our child.\n\nI now live with my little family in Wyoming, NSW. It's a little town on the Central Coast in Australia. I'm still working as a (Senior) Website Developer for a national TV/Radio broadcasting company in their video on-demand department. I commute every day to Sydney and that takes me about 1h45 each way. However, recently, due to the COVID-19 threat, most of us are working from home.\n\nMy took my first step into crypto in February 2018 with Steem and I have now moved on to Hive. In March 2018, I decided to run a Witness node to help the network and learn more about the platform.\n\nMy Hive Witness\n\nI'm now running a Hive Witness and I'm currently ranked #35. Over the years, I was involved in several community projects on Steem and I'm now progressively rebuilding my \"portfolio\" on Hive.\n\nMy current projects on Hive are:\n\nMy Hive Witness node runs on a server with the following specs:\n\nSocial media", "articleBodyHtml": "

My name

Let's start with my name, as this is subject to much confusion 😂. My Vietnamese first name is Quốc Huy, which means something like "the emblem of the country". For short, it's just Huy, and the closest easy pronunciation would be the English name Huey. See the following YouTube video: How to pronounce Huy.

My journey

So I'm Vietnamese, born in Laos, but I spent most of my childhood and teenage years in New Caledonia, which is a French colony (overseas territory). So I speak Vietnamese, French, and English (from middle/high school).


After high school, I went to Montpellier, in the south of France by the Mediterranean Sea, to study computer science, and I spent 10 years of my life in that beautiful city. After graduating from uni, I spent some time working for some French companies and then made my way to London, where my career as a website developer took off and I ended up working a couple of years for the Financial Times. In London, I married my wife and we had our son. Not long after, we both decided to move to Australia, where we believed life would be better for raising our child.

I now live with my little family in Wyoming, NSW. It's a little town on the Central Coast in Australia. I'm still working as a (Senior) Website Developer for a national TV/radio broadcasting company, in their video-on-demand department. I commute every day to Sydney, and that takes me about 1h45 each way. However, recently, due to the COVID-19 threat, most of us are working from home.


I took my first step into crypto in February 2018 with Steem, and I have now moved on to Hive. In March 2018, I decided to run a witness node to help the network and learn more about the platform.

My Hive Witness


I'm now running a Hive witness, and I'm currently ranked #35. Over the years, I was involved in several community projects on Steem, and I'm now progressively rebuilding my "portfolio" on Hive.

My current projects on Hive are:

My Hive Witness node runs on a server with the following specs:

Social media

", "canonicalUrl": "https://hive.blog/hive-174578/@quochuy/my-personal-and-witness-introduction-to-the-hive-community"},{"url": "https://hive.blog/hive-174578/@ocd/original-content-decentralized-hive-statement", "probability": 0.9572922, "headline": "Original Content Decentralized Hive statement", "datePublished": "2020-04-24T04:27:02.440258", "datePublishedRaw": "3 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://steemitimages.com/640x0/https://img.esteem.app/pokyoo.jpg", "images": ["https://images.hive.blog/768x0/https://steemitimages.com/640x0/https://img.esteem.app/pokyoo.jpg", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/haterslines/EbgXHlRo-work-45972214_fair-cuppa_O16E3-sweat-C3A0-capuche-C3A9pais201.jpg", "https://images.hive.blog/768x0/https://files.steempeak.com/file/steempeak/yahialababidi/enzbjDDn-10462821_10154361632625574_1216153225888063318_n.jpg"], "description": "Hello dear Steemians, it's very sad that it has come to this where this will be the last time we'll call you that but with the actions of the recent buyer of Steemit.com and a\u2026 by ocd", "articleBody": "Hello dear Steemians, it's very sad that it has come to this where this will be the last time we'll call you that but with the actions of the recent buyer of Steemit.com and a lot of stake it is safe to assume that there's nothing decentralized about Steem anymore and won't be in the future neither. Not only does that defeat the purpose of OCD but adding to that there is also censorship now without DMCA notices on the Steemit front-end and it is time to say good bye.\n\nWith every chapter ending, there's a new one beginning and this one already starts out very exciting from the first page. 
We've always been a community witness and our main priority has been to distribute stake as wide as possible even with the centralized Steemit stake in existence we tried our best and it was really heartwarming to see the community band together and take back so many real witnesses on top. Now we will continue to do so on the Hive blockchain where stake is already way more distributed and the chain is a lot more decentralized and has close to zero chance of being taken over in the hostile manner it has been here. Many may say this is a flaw of DPOS but our community proved that when push comes to shove it is also one of its strengths and something that makes all of this possible with how advanced our chain is compared to the rest.\n\nI didn't just want to list all the changes so they are instead written in italics if that's all you're interested in finding out in this post.\n\nWith the ninjamined stake aside now there is a lot of new hope being generated and we are all looking forward to the future of Hive. From the snapshot tomorrow and forward the OCD team will only be nominating posts for the compilations on the Hive blockchain but we will be posting onto both blockchains until our stake here is gone. Since community goes first we will still be making good use of our voting power and trails here but seeing as that will constantly be evaporating with power downs I don't think there will be too much to curate in the near future. Content creators may also not feel as free to post here if they have to watch out what they say because a company might hide their content which is something many here today have left their prior centralized platforms for.\n\nWe will also be removing the rule of not cross-posting your own posts onto the OCD community on the Steem blockchain, so go crazy! :)\n\nI don't think there is a lot more to say although as I started writing this I thought there would be. 
We will be doing some more free hand curation onto the smaller communities we are supporting on the Steem blockchain but on Hive it will remain the same for the time being until we know how much stake we have to work with if we can increase the amount of daily posts from each of the communities we can support. There, that's another change I just remembered!\n\nBlockchain is beautiful, make the most of it and the power it gives users and stakeholders and we will see and curate you in the chain of freedom!\n\nA big thank you to everyone for these crazy 3 years of OCD curation and distribution! This is literally the best community I've ever been part of and I can't wait to see it grow and take over the world.", "articleBodyHtml": "
\n\n

Hello dear Steemians, it's very sad that it has come to this: this will be the last time we'll call you that. With the actions of the recent buyer of Steemit.com and a lot of stake, it is safe to assume that there's nothing decentralized about Steem anymore, and there won't be in the future either. Not only does that defeat the purpose of OCD, but there is also now censorship without DMCA notices on the Steemit front-end, and it is time to say goodbye.


With every chapter ending, there's a new one beginning, and this one already starts out very exciting from the first page. We've always been a community witness, and our main priority has been to distribute stake as widely as possible. Even with the centralized Steemit stake in existence we tried our best, and it was really heartwarming to see the community band together and take back so many real witnesses on top. Now we will continue to do so on the Hive blockchain, where stake is already far more distributed and the chain is a lot more decentralized, with close to zero chance of being taken over in the hostile manner it has been here. Many may say this is a flaw of DPoS, but our community proved that when push comes to shove it is also one of its strengths, and something that makes all of this possible with how advanced our chain is compared to the rest.


I didn't want this post to be just a list of all the changes, so they are instead written in italics, in case that's all you're interested in finding in this post.


With the ninja-mined stake aside, there is now a lot of new hope being generated, and we are all looking forward to the future of Hive. From the snapshot tomorrow onward, the OCD team will only be nominating posts for the compilations on the Hive blockchain, but we will be posting onto both blockchains until our stake here is gone. Since community comes first, we will still be making good use of our voting power and trails here, but seeing as that will constantly be evaporating with power-downs, I don't think there will be too much to curate in the near future. Content creators may also not feel as free to post here if they have to watch what they say because a company might hide their content, which is something many here today left their prior centralized platforms over.


We will also be removing the rule of not cross-posting your own posts onto the OCD community on the Steem blockchain, so go crazy! :)


I don't think there is a lot more to say, although as I started writing this I thought there would be. We will be doing some more freehand curation in the smaller communities we are supporting on the Steem blockchain, but on Hive it will remain the same for the time being, until we know how much stake we have to work with and whether we can increase the number of daily posts from each of the communities we support. There, that's another change I just remembered!


Blockchain is beautiful, make the most of it and the power it gives users and stakeholders and we will see and curate you in the chain of freedom!


A big thank you to everyone for these crazy 3 years of OCD curation and distribution! This is literally the best community I've ever been part of and I can't wait to see it grow and take over the world.

", "canonicalUrl": "https://hive.blog/hive-174578/@ocd/original-content-decentralized-hive-statement"},{"url": "https://hive.blog/witness-category/@mahdiyari/my-updated-witness-post", "probability": 0.8633053, "headline": "My updated witness post", "datePublished": "2023-03-30T04:27:18.001470", "datePublishedRaw": "25 days ago", "author": "mahdiyari", "authorsList": ["mahdiyari"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/mahdiyari/243MLLyzCaQE9vovb7cRvXsurxG5bxcfZPnueaeudLFwreSD89XyCQLphTFVafGjUwsJu.jpg", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/mahdiyari/243MLLyzCaQE9vovb7cRvXsurxG5bxcfZPnueaeudLFwreSD89XyCQLphTFVafGjUwsJu.jpg", "https://images.hive.blog/DQmSWfbie9MTC172sENiA16bsMaz1ofT6AAyTo1ishasrcX/winexcomment.png"], "description": "My past, present, and future on Hive in a short post. by mahdiyari", "articleBody": "My past, present, and future on Hive in a short post.\n\nPast\n\nStarted my witness server in August 2017\nLaunched a hive node node.mahdiyari.info:2001 - Ongoing\nGot my first produced block after 17 days 14,970,426\nLaunched Steemfollower (discoverability tool) - Died after 1 or 2 years\nLaunched hive.vote October 2017 - Ongoing\nSteemclient - never launched\nDblog - never launched\nFrom 2018 to late 2019 smaller projects and maintenance of the ongoing projects\nLaunched hive-tx library in November 2019\nMigrating hive-js, hive-tx, and other projects to Hive after Hive fork\nPosting dev guides and migrating libraries after HF25\nSmartchain - never launched\nLaunched hive-PHP library in July 2022\nLaunched hivedex.io January 2023\nLaunched a public RPC node https://rpc.mahdiyari.info\nBig update on hive-tx library\n\nAnd the last one should be the account creation service on hivedex.io/signup\n\nThis list is development only\n\nPresent\n\nIt has been 5 and a half years since I started my witness. 
I don't post frequently and I don't work as hard as I should. But thanks to you I'm #25 in the witness ranking. I think it is a pretty good spot and I'm very happy that you trust me with your votes.\n\nWhile I'm not top 20, I still try my best to be present and put forward my opinion on any matter that concerns the blockchain and its features.\n\nI try to provide support and guidance in the various discord servers to the new and old users related to general Hive matters or development.\n\nFuture\n\nTBD\n\nI can't let you go without a cute kitten.\n\nI believe the best \"thank you\" is to continue doing what I do. Continuously contributing to the Hive ecosystem.", "articleBodyHtml": "
\n\n

My past, present, and future on Hive in a short post.


Past

  • Started my witness server in August 2017
  • Launched a hive node node.mahdiyari.info:2001 - Ongoing
  • Got my first produced block after 17 days (block 14,970,426)
  • Launched Steemfollower (discoverability tool) - died after 1 or 2 years
  • Launched hive.vote October 2017 - Ongoing
  • Steemclient - never launched
  • Dblog - never launched
  • From 2018 to late 2019: smaller projects and maintenance of the ongoing projects
  • Launched hive-tx library in November 2019
  • Migrated hive-js, hive-tx, and other projects to Hive after the Hive fork
  • Posted dev guides and migrated libraries after HF25
  • Smartchain - never launched
  • Launched hive-PHP library in July 2022
  • Launched hivedex.io January 2023
  • Launched a public RPC node https://rpc.mahdiyari.info
  • Big update on hive-tx library

And the last one should be the account creation service on hivedex.io/signup


This list covers development work only.


Present


It has been 5 and a half years since I started my witness. I don't post frequently and I don't work as hard as I should. But thanks to you I'm #25 in the witness ranking. I think it is a pretty good spot and I'm very happy that you trust me with your votes.


While I'm not in the top 20, I still try my best to be present and put forward my opinion on any matter that concerns the blockchain and its features.


I try to provide support and guidance in the various Discord servers to new and old users, on general Hive matters and development.


Future


TBD


I can't let you go without a cute kitten.

[Image: kitten-hive.jpg]

I believe the best "thank you" is to continue doing what I do: continuously contributing to the Hive ecosystem.

", "canonicalUrl": "https://peakd.com/witness-category/@mahdiyari/my-updated-witness-post"},{"url": "https://hive.blog/witness/@anyx/updated-witness-application", "probability": 0.96739787, "headline": "Updated Witness Application", "datePublished": "2019-04-24T04:27:19.133585", "datePublishedRaw": "4 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://i.imgur.com/RXRl5RT.png", "images": ["https://images.hive.blog/768x0/https://i.imgur.com/aU10Eid.jpg", "https://images.hive.blog/768x0/https://i.imgur.com/RXRl5RT.png", "https://images.hive.blog/768x0/https://steemitimages.com/u/cheetah/avatar", "https://images.hive.blog/768x0/https://steemitimages.com/u/steemcleaners/avatar"], "description": "As an academic at heart, I find myself often at odds with the \"popularity contest\" that Steem witness positions most often converge to. While I\u2019m a quiet person more often than\u2026 by anyx", "articleBody": "As an academic at heart, I find myself often at odds with the \"popularity contest\" that Steem witness positions most often converge to. While I\u2019m a quiet person more often than not, as @lukestokes has often pointed out to me that without information being neatly presented in a consumable fashion, most voters often don\u2019t have the time to fully research the reasons they should vote for someone -- and hence, word of mouth and popularity reign supreme.\n\nI have a hard time writing about myself (I often believe actions speak louder than words), but the goal of this post is to give an updated account of why you, as a holder of Steem, should vote for me as a witness; a bite size summary of the value that I offer and why I believe it\u2019s important. 
Since it\u2019s been quite some time since my original witness application, this updated post should help modernize information about my contributions.\n\nTo get started, it's crucial to understand the election process in Steem: the witnesses that you vote for are the ones who determine consensus of the ecosystem. They are, as I like to put it, the custodians of the blockchain. As a voter, you are delegating your trust to these actors. Therefore, the primary consideration in your votes should be your level of trust in that actor: specifically, trust for their ability to enforce security of the system.\n\nAn Academic and Technical Background\n\nTo start off, I\u2019ll provide some background about me. I am currently a PhD Candidate in Computer Engineering. My research topic for my dissertation is mostly on Distributed Systems, Parallelism, and Graph Processing. I also have done quite a bit of Machine Learning, though outside my dissertation. That might sound a bit technical, so to simplify things: my field is mostly about understanding, building, and analyzing systems and data. Steem is, in fact, an excellent example of such a system: it is both a distributed system, and its data can be represented as a time series graph. It\u2019s right up my alley.\n\nAs an academic, I'm a uniquely qualified expert in this domain: I have authored multiple peer reviewed publications at conferences such as Supercomputing and IPDPS (International Parallel and Distributed Processing Symposium). These conferences are highly regarded in the field of Computer Science and Engineering. Furthermore, I've previously worked in the research divisions of top tech companies like Intel and Facebook. As part of my graduate studies, I even help teach university courses on Distributed Systems. While many people in the cryptocurrency space certainly claim domain expertise, I do have the credentials to back it up.\n\nWhy is this background important or relevant? 
Well, simply put, I have expert knowledge in how Steem works, at its fundamentals. The ability to understand and identify security at its core is, in my opinion, crucial to the success of protecting the network. As an example, I published this post on Vote Incentivization and how it degrades Delegated Proof of Stake, a serious issue that can arise in DPoS.\n\nFurther, my expertise as a Computer Engineer (both academic and in industry) is also a key point: I speak C++, the language that the Steem software is implemented in. The value of the ability to identify, track, and repair issues in the code cannot be understated.\n\nAs an example, I recently discovered and fixed an issue in steem that caused witnesses to miss blocks seemingly randomly. The issue was in how the P2P protocol identified and handled error scenarios. I patched this issue, and most witnesses are already running this patch. Fixing issues like this is often done quietly and in the background: you, the average user, most often do not care about these kinds of software bugs. However, you should most certainly entrust your votes to people who can fix anything broken, so that the system you use can continue to operate smoothly. Now, obviously, I cannot claim or promise to be able to solve unknown future security issues, but I do believe having the right the skillset should be an important consideration when voting.\n\nOf course, speaking the language of the code has also allowed me to implement or port features into Steem. I recently added to Steem the feature of using Unix sockets for API nodes. Again, this is a technical thing, and one that's certainly not interesting to the average user. 
Yet, none the less, it is progress that goes on in the background to improve the performance of the Steem ecosystem.\n\nFinally, my technical background allows me to operate and run on dedicated computer hardware, rather than rent servers from a third party, like most witnesses do (opting for custodian services like Hetzner or OVH). A common phrase in the crypto community is \u201cnot your keys, not your crypto\u201d -- and in my opinion, this extends to the ownership of the hardware securing the keys, too. My hardware is proudly Canadian, owned and operated securely by me, in the great white north.\n\nRecently, a large amount of pressure and questions have risen from the community in the form of how decentralized the Steem ecosystem truly is. As some people are acutely aware, Steemit Inc has de-facto control over the Steem blockchain, and our witness positions exist simply because they currently allow it. With recent talks about \u201cforking out Steemit stake,\u201d something to be made clear is that such a fork would not be \u201cSteem\u201d. Although I am a staunch supporter of improving decentralization, I am not convinced this is the right method to do so. Yet, I do agree that actions should be taken to reduce reliance on Steemit Inc., and I have done and will continue to push this.\n\nI have already put in a lot of effort and made strides to improve the decentralization of the Steem ecosystem and reduce this reliance; in a few blog posts (What Makes a \u2018dApp\u2019 a \u2018dApp\u2019? and Fully Decentralizing dApps), I previously outlined the importance and ability to have fully decentralized applications.\n\nHowever, this is not all just talk or discussion. In my efforts to promote this, I launched a public full-node infrastructure for any application to use. 
I purchased tens of thousands of dollars\u2019 worth of hardware, and pay hosting fees of over $300/month to support this infrastructure, for the sole reason of improving the decentralization of Steem. As a result, this infrastructure has proven itself to be a reliable and performant alternative to Steemit Inc's own provided infrastructure.\n\nWhile the benefits of my push towards decentralization are often not visible to the end users, the impact to the ecosystem is already clear. As an example, the @steemmonsters application has moved over to run on this infrastructure, enabling them to become completely independent of Steemit Inc. Did you know @steemmonsters does more Steem transactions than Steemit does? Further, it is one of the few resources left that supports the popular Steem desktop wallet, Vessel.\n\nNot only does this operation requires continuous maintenance, but I still am working towards improving its performance and ensuring a reliable alternative to the current centralized Steemit Inc. solution. For example, my infrastructure design has already discarded Steemit\u2019s \u201cJussi\u201d in favour of my own custom solution built in Golang.\n\nDemonstrating Trust: My Commitment to Reducing Fraud\n\nAs the original founder of the @steemcleaners project, I remain the primary manager of the group, and the primary supporter of its financing -- regularly redistributing hundreds if not thousands of dollars weekly to community helpers. This organization has existed for over two and a half years, working daily to help reduce fraud, identity theft, and plagiarism here on Steem. 
While sometimes controversial, we regularly interact with the community, and those familiar with the level of work being done can attest to positive impact it has had and continues to provide.\n\nAs many also already know, I developed and continue to maintain the @cheetah bot, a service that looks through a post for similar content already existing on the web, in an effort to expose potential fraud. @cheetah is unfortunately quite expensive: the ability to scour the whole web for matching content does not come cheap. Historically, @cheetah 's cost varies with activity on the steem blockchain -- i.e., how many posts she needs to review -- costing over $200 per day in some cases. Despite the current downturn in activity due to the cryptocurrency bear market we are in, she still costs about $50 per day to operate ($1500/month), or about twice the revenue generated by the logs posted on the account. The rest of the financing is subsidized by my witness pay.\n\nAnother service I provide is @guard, a rapid response bot that identifies malicious links (such as to phishing sites) and informs users of the issue. Security of your cryptocurrency keys is a huge barrier for most people, and limiting the spread of phishing is crucial to wider adoption.\n\nWhile these anti-fraud initiatives that I spearhead are admittedly not perfect, I have continued to ensure that they improve over time. While these systems are not able to censor or remove content on Steem, their goal is instead to inform users and reduce the incentives for fraud in our community. I believe they are successful, and have gone a long way to reduce fraud, phishing, and plagiarism on this platform, and continue to do so every day.\n\nTL;DR / Summary.\n\nMy technical and academic background as a PhD Candidate in Computer Engineering, with actual domain expertise, I believe uniquely qualifies me as a trusted actor to secure the Steem blockchain. 
Further, I have demonstrated my ability to author software fixes and feature implementations on top of Steem.\n\nMy push for decentralization has not only been talk, but also show, by providing infrastructure support that has already enabled applications like @steemmonsters to entirely remove their reliance on Steemit Inc.\n\nFinally, my hard stance on fraud and my initiatives to address it may come controversial to some, but for many, having a system in place that attempts to address it allows other projects to spend more time on their positive and constructive goals.\n\nWhile a quiet person in general, I will continue to work in the background to improve the Steem ecosystem. If I\u2019ve convinced you to vote for me, I\u2019d encourage you to do so, here.", "articleBodyHtml": "
\n\n

As an academic at heart, I find myself often at odds with the "popularity contest" that Steem witness positions most often converge to. While I'm a quiet person more often than not, as @lukestokes has often pointed out to me, without information being neatly presented in a consumable fashion, most voters don't have the time to fully research the reasons they should vote for someone -- and hence, word of mouth and popularity reign supreme.


I have a hard time writing about myself (I often believe actions speak louder than words), but the goal of this post is to give an updated account of why you, as a holder of Steem, should vote for me as a witness: a bite-size summary of the value that I offer and why I believe it’s important. Since it’s been quite some time since my original witness application, this updated post should help modernize information about my contributions.


To get started, it's crucial to understand the election process in Steem: the witnesses that you vote for are the ones who determine consensus of the ecosystem. They are, as I like to put it, the custodians of the blockchain. As a voter, you are delegating your trust to these actors. Therefore, the primary consideration in your votes should be your level of trust in that actor: specifically, trust for their ability to enforce security of the system.


An Academic and Technical Background


To start off, I\u2019ll provide some background about me. I am currently a PhD Candidate in Computer Engineering. My research topic for my dissertation is mostly on Distributed Systems, Parallelism, and Graph Processing. I also have done quite a bit of Machine Learning, though outside my dissertation. That might sound a bit technical, so to simplify things: my field is mostly about understanding, building, and analyzing systems and data. Steem is, in fact, an excellent example of such a system: it is both a distributed system, and its data can be represented as a time series graph. It\u2019s right up my alley.


As an academic, I'm a uniquely qualified expert in this domain: I have authored multiple peer reviewed publications at conferences such as Supercomputing and IPDPS (International Parallel and Distributed Processing Symposium). These conferences are highly regarded in the field of Computer Science and Engineering. Furthermore, I've previously worked in the research divisions of top tech companies like Intel and Facebook. As part of my graduate studies, I even help teach university courses on Distributed Systems. While many people in the cryptocurrency space certainly claim domain expertise, I do have the credentials to back it up.


Why is this background important or relevant? Well, simply put, I have expert knowledge in how Steem works, at its fundamentals. The ability to understand and identify security at its core is, in my opinion, crucial to the success of protecting the network. As an example, I published this post on Vote Incentivization and how it degrades Delegated Proof of Stake, a serious issue that can arise in DPoS.


Further, my expertise as a Computer Engineer (both academic and in industry) is also a key point: I speak C++, the language that the Steem software is implemented in. The value of the ability to identify, track, and repair issues in the code cannot be overstated.


As an example, I recently discovered and fixed an issue in Steem that caused witnesses to miss blocks seemingly randomly. The issue was in how the P2P protocol identified and handled error scenarios. I patched this issue, and most witnesses are already running this patch. Fixing issues like this is often done quietly and in the background: you, the average user, most often do not care about these kinds of software bugs. However, you should most certainly entrust your votes to people who can fix anything broken, so that the system you use can continue to operate smoothly. Now, obviously, I cannot claim or promise to be able to solve unknown future security issues, but I do believe having the right skillset should be an important consideration when voting.


Of course, speaking the language of the code has also allowed me to implement or port features into Steem. I recently added to Steem the feature of using Unix sockets for API nodes. Again, this is a technical thing, and one that's certainly not interesting to the average user. Yet, nonetheless, it is progress that goes on in the background to improve the performance of the Steem ecosystem.


Finally, my technical background allows me to operate and run on dedicated computer hardware, rather than rent servers from a third party, like most witnesses do (opting for custodian services like Hetzner or OVH). A common phrase in the crypto community is \u201cnot your keys, not your crypto\u201d -- and in my opinion, this extends to the ownership of the hardware securing the keys, too. My hardware is proudly Canadian, owned and operated securely by me, in the great white north.


Recently, a large amount of pressure and many questions have arisen from the community about how decentralized the Steem ecosystem truly is. As some people are acutely aware, Steemit Inc has de-facto control over the Steem blockchain, and our witness positions exist simply because they currently allow it. With recent talks about “forking out Steemit stake,” something to be made clear is that such a fork would not be “Steem”. Although I am a staunch supporter of improving decentralization, I am not convinced this is the right method to do so. Yet, I do agree that actions should be taken to reduce reliance on Steemit Inc., and I have pushed for this and will continue to do so.


I have already put in a lot of effort and made strides to improve the decentralization of the Steem ecosystem and reduce this reliance; in a few blog posts (What Makes a \u2018dApp\u2019 a \u2018dApp\u2019? and Fully Decentralizing dApps), I previously outlined the importance and ability to have fully decentralized applications.


However, this is not all just talk or discussion. In my efforts to promote this, I launched a public full-node infrastructure for any application to use. I purchased tens of thousands of dollars\u2019 worth of hardware, and pay hosting fees of over $300/month to support this infrastructure, for the sole reason of improving the decentralization of Steem. As a result, this infrastructure has proven itself to be a reliable and performant alternative to Steemit Inc's own provided infrastructure.


While the benefits of my push towards decentralization are often not visible to the end users, the impact to the ecosystem is already clear. As an example, the @steemmonsters application has moved over to run on this infrastructure, enabling them to become completely independent of Steemit Inc. Did you know @steemmonsters does more Steem transactions than Steemit does? Further, it is one of the few resources left that supports the popular Steem desktop wallet, Vessel.


Not only does this operation require continuous maintenance, but I am still working towards improving its performance and ensuring a reliable alternative to the current centralized Steemit Inc. solution. For example, my infrastructure design has already discarded Steemit’s “Jussi” in favour of my own custom solution built in Golang.


Demonstrating Trust: My Commitment to Reducing Fraud


As the original founder of the @steemcleaners project, I remain the primary manager of the group, and the primary supporter of its financing -- regularly redistributing hundreds if not thousands of dollars weekly to community helpers. This organization has existed for over two and a half years, working daily to help reduce fraud, identity theft, and plagiarism here on Steem. While sometimes controversial, we regularly interact with the community, and those familiar with the level of work being done can attest to the positive impact it has had and continues to provide.


As many also already know, I developed and continue to maintain the @cheetah bot, a service that looks through a post for similar content already existing on the web, in an effort to expose potential fraud. @cheetah is unfortunately quite expensive: the ability to scour the whole web for matching content does not come cheap. Historically, @cheetah's cost varies with activity on the Steem blockchain -- i.e., how many posts she needs to review -- costing over $200 per day in some cases. Despite the current downturn in activity due to the cryptocurrency bear market we are in, she still costs about $50 per day to operate ($1500/month), or about twice the revenue generated by the logs posted on the account. The rest of the financing is subsidized by my witness pay.


Another service I provide is @guard, a rapid response bot that identifies malicious links (such as to phishing sites) and informs users of the issue. Security of your cryptocurrency keys is a huge barrier for most people, and limiting the spread of phishing is crucial to wider adoption.


While these anti-fraud initiatives that I spearhead are admittedly not perfect, I have continued to ensure that they improve over time. While these systems are not able to censor or remove content on Steem, their goal is instead to inform users and reduce the incentives for fraud in our community. I believe they are successful, and have gone a long way to reduce fraud, phishing, and plagiarism on this platform, and continue to do so every day.


TL;DR / Summary.


I believe my technical and academic background as a PhD Candidate in Computer Engineering, with actual domain expertise, uniquely qualifies me as a trusted actor to secure the Steem blockchain. Further, I have demonstrated my ability to author software fixes and feature implementations on top of Steem.


My push for decentralization has not only been talk but also action: providing infrastructure support that has already enabled applications like @steemmonsters to entirely remove their reliance on Steemit Inc.

\n\n

Finally, my hard stance on fraud and my initiatives to address it may seem controversial to some, but for many, having a system in place that attempts to address fraud allows other projects to spend more time on their positive and constructive goals.

\n\n

While I am a quiet person in general, I will continue to work in the background to improve the Steem ecosystem. If I've convinced you to vote for me, I'd encourage you to do so here.

\n\n
", "canonicalUrl": "https://steemit.com/witness/@anyx/updated-witness-application"},{"url": "https://hive.blog/witness-update/@someguy123/welcome-hard-fork-20-including-update-info-for-steem-in-a-box-users", "probability": 0.65210366, "headline": "Welcome, Hard Fork 20! (including update info for Steem-in-a-box users)", "datePublished": "2018-04-24T04:27:22.860960", "datePublishedRaw": "5 years ago", "dateModified": "2018-04-24T04:27:22.857730", "dateModifiedRaw": "5 years ago", "author": "@stellabelle", "authorsList": ["@stellabelle"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://cdn.steemitimages.com/DQmYnW6uRhezd33jfWZu1RQ1nNyrAk8nHXvH6thHfyJS3L9/image.png", "images": ["https://images.hive.blog/768x0/https://cdn.steemitimages.com/DQmYnW6uRhezd33jfWZu1RQ1nNyrAk8nHXvH6thHfyJS3L9/image.png"], "description": "At 3 PM UTC today (25 Sep 2018), Hard Fork 20 was activated on the Steem network. Any Steem server not running HF20 was promptly disconnected from the network, and the top 20\u2026 by someguy123", "articleBody": "At 3 PM UTC today (25 Sep 2018), Hard Fork 20 was activated on the Steem network.\n\nAny Steem server not running HF20 was promptly disconnected from the network, and the top 20 are now all producing blocks on HF20.\n\nUnfortunately due to the new Resource Credits system, some people are unable to post, upvote, or even adjust their witness key. 
This will resolve itself over the next few days as the RC system reaches equilibrium.\n\nHow can I update my witness to HF20?\n\nOnce your config is updated, make sure you're on the right branch:\n\ncd ~/steem-docker git checkout master git pull\n\nFinally, the same as every other Steem-in-a-box update:\n\n./run.sh install ./run.sh stop ./run.sh replay\n\nOnce replayed, you'll be on HF20.\n\nFresh install and upgrade instructions at https://steemit.com/witness-category/@someguy123/the-easy-way-to-install-or-upgrade-to-hardfork-20-steem-in-a-box are still valid for 0.20.2\n\nHF20 version 0.20.2 has been available for several days in Steem-in-a-box, and instructions were made available earlier in the witness channel on STEEM.CHAT.\n\nGIF Avatar by\n\nDo you like what I'm doing for STEEM/Steemit?\n\nVote for me to be a witness - every vote counts.\n\nDon't forget to follow me for more like this.\n\nHave you ever thought about being a witness yourself? Join the witness channel. We're happy to guide you! Join in shaping the STEEM economy.", "articleBodyHtml": "
\n\n

At 3 PM UTC today (25 Sep 2018), Hard Fork 20 was activated on the Steem network.

\n\n

Any Steem server not running HF20 was promptly disconnected from the network, and the top 20 are now all producing blocks on HF20.

\n\n

Unfortunately, due to the new Resource Credits system, some people are unable to post, upvote, or even adjust their witness key. This will resolve itself over the next few days as the RC system reaches equilibrium.

\n\n

How can I update my witness to HF20?

\n\n
\n\n

Once your config is updated, make sure you're on the right branch:

\n\n
cd ~/steem-docker
git checkout master
git pull
\n\n

Finally, the same as every other Steem-in-a-box update:

\n\n
./run.sh install
./run.sh stop
./run.sh replay
\n\n

Once replayed, you'll be on HF20.

\n\n

Fresh install and upgrade instructions at https://steemit.com/witness-category/@someguy123/the-easy-way-to-install-or-upgrade-to-hardfork-20-steem-in-a-box are still valid for 0.20.2

\n\n

HF20 version 0.20.2 has been available for several days in Steem-in-a-box, and instructions were made available earlier in the witness channel on STEEM.CHAT.

\n\n
\n\n

GIF Avatar by

\n\n

Do you like what I'm doing for STEEM/Steemit?

\n\n

Vote for me to be a witness - every vote counts.

\n\n

Don't forget to follow me for more like this.

\n\n

Have you ever thought about being a witness yourself? Join the witness channel. We're happy to guide you! Join in shaping the STEEM economy.

\n\n
", "canonicalUrl": "https://steemit.com/witness-update/@someguy123/welcome-hard-fork-20-including-update-info-for-steem-in-a-box-users"},{"url": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq?sort=trending#comments", "probability": 0.8032649, "headline": "SPK Network Team Meeting Recording #1", "datePublished": "2023-04-21T04:27:50.004798", "datePublishedRaw": "3 days ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeieqfz64zubiky6awfgyvvjanigm6rno5dg2jcj4nv7xkr2p22i3mu", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tmDmbuy6SvS8ktfSayxCGeNgNEog6KKWXrT66uSc7EXnfsqy3QMqyiNQ6acE6NFiCS5.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak This is the recording of today's meeting. We plan to record every week on Thursdays at 20:00 UTC, so stay tuned. 
The live meeting occurs\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": " ", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq"},{"url": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq?sort=votes#comments", "probability": 0.8032649, "headline": "SPK Network Team Meeting Recording #1", "datePublished": "2023-04-21T04:27:50.495567", "datePublishedRaw": "3 days ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeieqfz64zubiky6awfgyvvjanigm6rno5dg2jcj4nv7xkr2p22i3mu", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tmDmbuy6SvS8ktfSayxCGeNgNEog6KKWXrT66uSc7EXnfsqy3QMqyiNQ6acE6NFiCS5.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak This is the recording of today's meeting. We plan to record every week on Thursdays at 20:00 UTC, so stay tuned. 
The live meeting occurs\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": " ", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq"},{"url": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq?sort=new#comments", "probability": 0.8032649, "headline": "SPK Network Team Meeting Recording #1", "datePublished": "2023-04-21T04:27:51.520141", "datePublishedRaw": "3 days ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://ipfs-3speak.b-cdn.net/ipfs/bafybeieqfz64zubiky6awfgyvvjanigm6rno5dg2jcj4nv7xkr2p22i3mu", "images": ["https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tmDmbuy6SvS8ktfSayxCGeNgNEog6KKWXrT66uSc7EXnfsqy3QMqyiNQ6acE6NFiCS5.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/spknetwork/23tkpCz5fuuTPRBFcbQz9ihncoGt7qVFEhxiWB6AYesGjBzkPfaZcDNxerj4vbq575nZe.png"], "description": "\u25b6\ufe0f Watch on 3Speak This is the recording of today's meeting. We plan to record every week on Thursdays at 20:00 UTC, so stay tuned. The live meeting occurs\u2026 by spknetwork", "articleBody": "\u25b6\ufe0f Watch on 3Speak", "articleBodyHtml": " ", "canonicalUrl": "https://hive.blog/hive-112019/@spknetwork/zznhxmqq"},{"url": "https://hive.blog/steem/@neoxian/neoxian-witness-application", "probability": 0.6833244, "headline": "Neoxian: witness application", "datePublished": "2018-04-24T04:27:56.061038", "datePublishedRaw": "5 years ago", "author": "neoxian", "authorsList": ["neoxian"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://4.bp.blogspot.com/_f6vPFbisEsA/TKzq1d4sCeI/AAAAAAAAC1s/JXQf81x4RY0/s320/pedro.jpg", "images": ["https://images.hive.blog/768x0/https://4.bp.blogspot.com/_f6vPFbisEsA/TKzq1d4sCeI/AAAAAAAAC1s/JXQf81x4RY0/s320/pedro.jpg"], "description": "Hey all, I'm doing this mostly because the folks at MSP (Minnow Support Group) and PAL asked me to. Here is my witness application: I am Neoxian. 
Vote for me. ....\u2026 by neoxian", "articleBody": "Hey all,\n\nI'm doing this mostly because the folks at MSP (Minnow Support Group) and PAL asked me to.\n\nHere is my witness application:\n\nI am Neoxian. Vote for me.\n\n....\n\n....\n\n....\n\nAlright, I suppose I should say a little more.\n\nReasons to vote for me:\n\nI've been here since last July 2016 and I'm heavily invested would like to see Steem do well.\n\nI'm pretty active on the chain, posting, curating, flagging spam and plagiarism.\n\nI've run various businesses on the blockchain, to help demonstrate the business can be done in Steem. I was a pioneer being one of the first to rent delegations which has now become big business here. I have also loaned money to various people in need, helping them out of a tight spot.\n\nVote for me, and I'll make all your dreams come true.\n(Pedro pic found on the internets)", "articleBodyHtml": "
\n\n

Hey all,

\n\n

I'm doing this mostly because the folks at MSP (Minnow Support Group) and PAL asked me to.

\n\n

Here is my witness application:

\n\n

I am Neoxian. Vote for me.

\n\n

....

\n\n

....

\n\n

....

\n\n

Alright, I suppose I should say a little more.

\n\n

Reasons to vote for me:

\n\n
  • I've been here since July 2016, I'm heavily invested, and I would like to see Steem do well.

  • \n
  • I'm pretty active on the chain, posting, curating, flagging spam and plagiarism.

  • \n
  • I've run various businesses on the blockchain, to help demonstrate that business can be done on Steem. I was a pioneer, one of the first to rent delegations, which has now become big business here. I have also loaned money to various people in need, helping them out of a tight spot.

\n\n

Vote for me, and I'll make all your dreams come true.
\n
\n(Pedro pic found on the internets)

\n\n
\"Neoxian-FINAL-FRAME2.gif\"
\n\n
", "canonicalUrl": "https://steemit.com/steem/@neoxian/neoxian-witness-application"},{"url": "https://hive.blog/witness-category/@fbslo/witness-re-announcement", "probability": 0.95508444, "headline": "Witness (re)Announcement", "datePublished": "2020-04-24T04:27:58.780822", "datePublishedRaw": "3 years ago", "author": "fbslo", "authorsList": ["fbslo"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmfVEivevHedMRTNhxSyZjEeTBAxERLSV7SUa3WVwvNX41/social_hive_light.jpg", "images": ["https://images.hive.blog/DQmfVEivevHedMRTNhxSyZjEeTBAxERLSV7SUa3WVwvNX41/social_hive_light.jpg"], "description": "Witness (re)Announcement After 1 year and 4 months, I'm officially back in the witness game! To understand why I have decided to come back, you should probably\u2026 by fbslo", "articleBody": "Witness (re) Announcement\n\nAfter 1 year and 4 months, I'm officially back in the witness game! To understand why I have decided to come back, you should probably understand why I left in the first place.\n\nI wrote my witness resignation post, where I described the issues I had, but mainly it was a problem with the low price of STEEM, high server costs, and PayPal banning my account. I also lost my passion for the project, Steemit Inc. was dumping hundreds of thousands of STEEM every month.\n\nWhile Hive still has some of the original problems, I have much more faith in its success. My personal financial situation has also improved, so now I can afford to run a witness server again, plus recent changes to hived software made it easier to run on cheaper hardware (my server has only 32 GB RAM instead of 64 GB).\n\nI was thinking about starting a witness again since the birth of Hive Blockchain, but I was unsure if I want to go back to the politics game (not many of you know this, since , but I suffer from social anxiety, so campaigning for votes is not my favourite thing to do). 
But after a recent Twitter poll, I was encouraged.\n\nI'm running Hived v1.24.2\n\nBackup ^\n\n^not online 24/7, but can be added fast if the main server goes offline\n\nGoals & Projects\n\nDecentralization: Supporting decentralization and stability of Hive blockchain (as every witness does).\n\nWrapped Hive *: We are working on a decentralized system that would replace current \"custodial\" one. I'm hoping to have mainnet running before December, but since we changed our original plans (one central node and validators) and replaced it with even more decentralized (only validators, each validator acts as a central node for some time), it might take longer.\nWrapped Hive Engine tokens: Since a very successful launch of wLEO, some other HE tokens are also working on getting their own wTokens. If you are interested, send me a message on discord. P.S. Code is open-source!\n\n* I have received compensation for these projects.\n\nThe current main goal I would like to achieve are:", "articleBodyHtml": "
\n\n

Witness (re)Announcement

\n\n
\"social_hive_light.jpg\"
\n\n


\n\n

After 1 year and 4 months, I'm officially back in the witness game! To understand why I have decided to come back, you should probably understand why I left in the first place.

\n\n

I wrote my witness resignation post, where I described the issues I had; mainly, they were the low price of STEEM, high server costs, and PayPal banning my account. I also lost my passion for the project; Steemit Inc. was dumping hundreds of thousands of STEEM every month.

\n\n

While Hive still has some of the original problems, I have much more faith in its success. My personal financial situation has also improved, so now I can afford to run a witness server again, plus recent changes to hived software made it easier to run on cheaper hardware (my server has only 32 GB RAM instead of 64 GB).

\n\n

I was thinking about starting a witness again since the birth of the Hive blockchain, but I was unsure whether I wanted to go back to the politics game (not many of you know this, but I suffer from social anxiety, so campaigning for votes is not my favourite thing to do). But after a recent Twitter poll, I was encouraged.

\n\n

I'm running Hived v1.24.2

\n\n

Backup ^

\n\n

^not online 24/7, but can be added fast if the main server goes offline

\n\n

Goals & Projects

\n\n
  • Decentralization: Supporting decentralization and stability of Hive blockchain (as every witness does).
\n\n


\n\n
  • Wrapped Hive *: We are working on a decentralized system that would replace the current \"custodial\" one. I'm hoping to have the mainnet running before December, but since we changed our original plan (one central node plus validators) to an even more decentralized design (validators only, with each validator acting as the central node for some time), it might take longer.
  • \n
  • Wrapped Hive Engine tokens: Since the very successful launch of wLEO, some other HE tokens are also working on getting their own wTokens. If you are interested, send me a message on Discord. P.S. The code is open-source!
\n\n

* I have received compensation for these projects.

\n\n

The current main goals I would like to achieve are:

\n\n
", "canonicalUrl": "https://hive.blog/witness-category/@fbslo/witness-re-announcement"},{"url": "https://hive.blog/hive-112019/@poshtoken/re-spknetwork-zznhxmqq145961103", "probability": 0.5576676, "headline": "RE: SPK Network Team Meeting Recording #1", "datePublished": "2023-04-21T04:27:59.490028", "datePublishedRaw": "3 days ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://files.peakd.com/file/peakd-hive/poshtoken/poshcoin.png", "description": "The rewards earned on this comment will go directly to the people( @yeckingo1, @jomancub ) sharing the post on Twitter as long as they are registered with @poshtoken. Sign up at by poshtoken", "canonicalUrl": "https://hive.blog/hive-112019/@poshtoken/re-spknetwork-zznhxmqq145961103"},{"url": "https://hive.blog/witness-category/@patrice/patrice-witness-application", "probability": 0.93415666, "headline": "@patrice Witness Application", "datePublished": "2017-04-24T04:28:49.331818", "datePublishedRaw": "6 years ago", "author": "patrice", "authorsList": ["patrice"], "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://i.imgur.com/LplF9EQ.png", "images": ["https://images.hive.blog/768x0/https://i.imgur.com/LplF9EQ.png"], "description": "Wow! Things have changed a lot for me in the last year. 4 Reasons You Should Vote For @patrice As A Witness Today! I wasn't expecting so much support from everyone. Neither\u2026 by patrice", "articleBody": "Wow! Things have changed a lot for me in the last year.\n\nI wasn't expecting so much support from everyone. Neither was I expecting to be number #83 on the witness list by the time I was able to write my witness proposal. I really appreciate the support everyone has shown me.\n\nsource\n\nWhen I started my journey here on Steemit I was searching for something to do with myself while I recovered from a recent illness. 
I never dreamed I'd find a something that 'fit' me.\n\nI took a few months hiatus but since my return 4 months ago I've made Steemit my full time job. I've been content to work behind the scenes and post on issues I think are important in the community. I'm not one to seek the spotlight and honestly without the support and encouragement of @stellabelle and a fellow member of SteemCleaners @anyx I probably wouldn't have decided to run for witness.\n\nAs I said in a previous post I look at this as a serious undertaking whether or not a witness is number #200 or #1.\n\nSomeone asked me why I want to be a witness... For me it isn't about being a witness, it is how I can make a bigger difference in the community by becoming a witness. I want to work with the already great group of witnesses we have and initiatives on Steemit to pool resources to not only fight abuse but support new members.\n\n@steemcleaners\n\nI am a founding member of @steemcleaners. In the recent months I've started developing steemcleaners.org to be used as a tool for reporting and handling abuse. With the influx of users will come more spam & abuse. We need to have in place a scalable solution.\n\n@spaminator\n\nI created @spaminator to deal with a specific area abuse and do it somewhat differently than @steemcleaners. @spaminator takes a less hands on approach to most issues and will illicit community support and involvement. It's also a way for everyone in the community to voice their opinion on what spam is and how it should be dealt with.\n\nWe've recently added a new member to the team with some programing skills & we look forward to developing tools to not only stop spammers but help educate new users coming in from other social platforms that need guidance.\n\nShort Bio\n\nMy name is Patricia. I'm 44 years old and a grandmother of two living in the Mid south, USA.\n\nI quit high school when I was 18 and started college majoring in accounting. 
After a year and with the birth of my son I quit college and went to work bar-tending.\n\nI've worked in several fields and have held numerous jobs over the years as I\u2019ve moved every 2 to 6 years. I currently hold a CDL drove 18 wheelers when I was younger. I absolutely loved the few years I worked as a school bus driver for special education students.\n\nIn the late 90's I was hired as a computer tech at CompUSA, later I worked at a local computer store and eventually found my way to a business process outsourcing company where I was hire as their network administrator.\n\nI worked in the cigar industry until 2010 when I returned to the Mid South. Not long after I returned home I was contacted by a friend who needed someone to freelance for a startup company providing help desk support and remote network monitoring, unfortunately they closed their doors earlier this year.\n\nVote\n\nScroll to the bottom of the page and add patrice under \"If you would like to vote for a witness outside of the top 50, enter the account name below to cast a vote.\"", "articleBodyHtml": "
\n\n

Wow! Things have changed a lot for me in the last year.

\n\n

I wasn't expecting so much support from everyone. Neither was I expecting to be #83 on the witness list by the time I was able to write my witness proposal. I really appreciate the support everyone has shown me.

\n\n



\n\n

When I started my journey here on Steemit, I was searching for something to do with myself while I recovered from a recent illness. I never dreamed I'd find something that 'fit' me.

\n\n

I took a few months hiatus but since my return 4 months ago I've made Steemit my full time job. I've been content to work behind the scenes and post on issues I think are important in the community. I'm not one to seek the spotlight and honestly without the support and encouragement of @stellabelle and a fellow member of SteemCleaners @anyx I probably wouldn't have decided to run for witness.

\n\n

As I said in a previous post, I look at this as a serious undertaking, whether a witness is #200 or #1.

\n\n

Someone asked me why I want to be a witness... For me it isn't about being a witness; it is about how I can make a bigger difference in the community by becoming one. I want to work with the already great group of witnesses and initiatives we have on Steemit to pool resources, not only to fight abuse but also to support new members.

\n\n

@steemcleaners

\n\n

I am a founding member of @steemcleaners. In recent months I've started developing steemcleaners.org as a tool for reporting and handling abuse. With the influx of users will come more spam and abuse; we need a scalable solution in place.

\n\n

@spaminator

\n\n

I created @spaminator to deal with a specific area of abuse, and to do it somewhat differently than @steemcleaners. @spaminator takes a less hands-on approach to most issues and will elicit community support and involvement. It's also a way for everyone in the community to voice their opinion on what spam is and how it should be dealt with.

\n\n

We've recently added a new member to the team with some programming skills, and we look forward to developing tools not only to stop spammers but also to help educate new users coming in from other social platforms who need guidance.

\n\n

Short Bio

\n\n

My name is Patricia. I'm 44 years old and a grandmother of two living in the Mid-South, USA.

\n\n

I quit high school when I was 18 and started college majoring in accounting. After a year and with the birth of my son I quit college and went to work bar-tending.

\n\n

I've worked in several fields and have held numerous jobs over the years, as I've moved every 2 to 6 years. I currently hold a CDL and drove 18-wheelers when I was younger. I absolutely loved the few years I worked as a school bus driver for special education students.

\n\n

In the late 90's I was hired as a computer tech at CompUSA; later I worked at a local computer store and eventually found my way to a business process outsourcing company, where I was hired as their network administrator.

\n\n

I worked in the cigar industry until 2010, when I returned to the Mid-South. Not long after I returned home, I was contacted by a friend who needed someone to freelance for a startup company providing help desk support and remote network monitoring; unfortunately, they closed their doors earlier this year.

\n\n

Vote

\n\n

Scroll to the bottom of the page and add patrice under \"If you would like to vote for a witness outside of the top 50, enter the account name below to cast a vote.\"

\n\n
", "canonicalUrl": "https://steemit.com/witness-category/@patrice/patrice-witness-application"},{"url": "https://hive.blog/witness-category/@bdcommunity/announcing-bdcommunity-hive-witness", "probability": 0.9543544, "headline": "Announcing BDCommunity Hive Witness", "datePublished": "2020-04-24T04:28:52.298351", "datePublishedRaw": "3 years ago", "inLanguage": "en", "mainImage": "https://images.hive.blog/1200x630/https://images.hive.blog/DQmX4ANaTxUjWi5k6NmBD7uCk9y3haQ294z8f5gXDGwxpa4/bdc-witness.png", "images": ["https://images.hive.blog/DQmX4ANaTxUjWi5k6NmBD7uCk9y3haQ294z8f5gXDGwxpa4/bdc-witness.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/mattsanthonyit/JVn20D0I-Screenshot_20200411-013822_1.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/javiermurillo/ahMFiE3X-imagen.png", "https://images.hive.blog/768x0/https://files.peakd.com/file/peakd-hive/mattsanthonyit/XLOJ6DsN-Screenshot_20200411-013803_1.png", "https://images.hive.blog/DQmZiFCHkAjKe1dExH3xTw7aqSD4C9gampBaorLxzdbU66z/bdc-peakd.png"], "description": "BDCommunity now is a Hive Witness. We have produced our first block #42355054 2 days ago and produced a total of 3 until now. Our full rank is #111 and active rank is #95.\u2026 by bdcommunity", "articleBody": "BDCommunity now is a Hive Witness. We have produced our first block #42355054 2 days ago and produced a total of 3 until now. Our full rank is #111 and active rank is #95.\n\nWhat\u2019s a witness?\n\nHive Witness is a person or a group of persons who run the Hive Blockchain software on a computer which produces blocks on the Hive Blockchain. They also publish HIVE price feed which helps the blockchain determine post payouts and such.\n\nWitnesses are elected by the community through a mechanism called delegated proof of stake. 
Witnesses receive HIVE POWER as rewards for producing blocks (processing transactions).\n\nWhy a new witness?\n\nFor a decentralized network like Hive Blockchain, the more people run the software and validates the transactions, the better. In our mind, a witness is a voice or representative of a community. We felt that our community is under-presented in the blockchain and we plan to alter that. We prefer to give small players a voice. We like the say to the greater hive community that consider us as friends, consider us as your representative so that we can voice your opinion to the governing body of this blockchain.\n\nWhat are the configuration of the witness server?\n\nWe are running the witness node on VPS from @privex with the following configuration.\n\nCPU: 4 Core RAM: 8 GB SSD: 500 GB OS: Ubuntu 18.04\n\nThis basic setup is for $50/month. The monthly expense is substantial in the developing world, however, we are willing to sacrifice that for the visibility and representation. We need your support. Until we qualify for the top 20, we can\u2019t represent you to the full extent. So that should be our common goal.\n\nRunning price feed from a Vultr VPS. At this moment we do not have a backup witness node but planning to get one in the future.\n\nWhat\u2019s in it for BDCommunity?\n\nBDCommunity has been on the chain for almost 2 years curating contents, helping and representing Bangladeshi and Bengali speaking community. Now we want to show your support for the chain by taking part in the governance and security of the chain.\n\nWhat\u2019s in it for the Hive community?\n\nWe like for the greater community to enable us as a sounding board, not only as a representative of a specific part of the world but also to rise above the tunnel vision that is typical of many blockchain projects. We want to bring something new to the table. 
This is in the form of bringing emerging content producers; writers, artists, music producers, comedians that we see in our everyday life, to the hive blockchain. This will not only enrich us at a personal level but it will give the small players a voice, a vision, a unique experience. For investors, and venture capitalists; we like to offer competitive return-on-investment without the hassle in daily monitoring. We are manual curators and will always be the human touch on this blockchain.\n\nHow can an individual be involved?\n\nHive being a DPOS blockchain, every user has a say in the governance of the chain proportional to their HP. We have recently proved that every vote counts. So, vote us as a witness or set us as your witness voting proxy.\n\nWho are running the witness?\n\n@reazuliqbal - Full-stack freelance developer. Developer and Moderator of BDCommunity.\n@zaku - Business and idea guy. Moderator of BDCommunity.\n\nWhat have we done previously?\n\nContent Curation\n\nBDCommunity started as a content curation project for Bengali speaking community as there weren't any back then. Now we curate contents from both Bengali and non-Bengali users using the community curation account and sister project @bdvoter.\n\nOnboarding\n\nWe create free accounts for legitimate users who want to join the Blockchain and help them understand the basics and norms of the chain and how to secure their accounts.\n\nTools\n\nProjects\n\nMonsterMarket - Splinterlands cards and pack marketplace.\nBDExchange - A Discord HIVE and STEEM, and Steem-Engine wallet. Which as provides escrow services for peer to peer transactions.\nBDVoter (@bdvoter) - For-profit curation project.\n\nWhat will you do in the future?\n\nWe are planning to enhance our curation projects and develop more tools and projects on top of the Hive Blockchain.\n\nWhat is the mission statement?\n\nBDC has a vision to curate and support of content across all over the hive interface. 
There is a desire to on-board not only new members but members who are emerging players in their respective fields. Introduce the power of social media on the blockchain to already established or emerging cultural leaders of the broader ecosystem is what BDC plans to achieve in the near future. This helps Hive community in 2 ways.... first: enrichment of the platform itself; and second: the spread of 'hive' to a more prominent segment of the social media by enrichment via the 'star content producers'\n\nHow do I vote you as a Witness?\n\nHivesigner\n\nbdcommunity at the form below the witness list.\n\nPeakd.com\n\nPlease go to https://peakd.com/witnesses and search for bdcommunity Click the check button to vote for us.", "articleBodyHtml": "
\n\n
\"bdc-witness.png\"
\n\n

BDCommunity is now a Hive witness. We produced our first block, #42355054, two days ago, and have produced a total of 3 so far. Our full rank is #111 and our active rank is #95.

\n\n

What's a witness?

\n\n

A Hive witness is a person or group of persons who runs the Hive blockchain software on a computer that produces blocks on the Hive blockchain. Witnesses also publish a HIVE price feed, which helps the blockchain determine post payouts and the like.

\n\n

Witnesses are elected by the community through a mechanism called delegated proof of stake. Witnesses receive HIVE POWER as rewards for producing blocks (processing transactions).
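The production schedule can be sketched with a simplified model (illustrative only, not the actual hived scheduler; the witness names are made up): each round, every elected witness is given exactly one block slot, in a pseudo-random order derived from a round seed.

```python
import random

BLOCK_INTERVAL_SECONDS = 3  # Steem/Hive produce one block every 3 seconds

def round_schedule(witnesses, seed):
    """One simplified DPoS round: every elected witness gets exactly one
    block slot, in a pseudo-random order derived from the round seed."""
    order = list(witnesses)
    random.Random(seed).shuffle(order)
    return order

witnesses = ["alice", "bob", "carol", "dave"]
schedule = round_schedule(witnesses, seed=42)
assert sorted(schedule) == sorted(witnesses)  # one slot each per round
```

The key property the sketch captures is fairness within a round: no elected witness produces a second block before every other elected witness has produced one.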

\n\n

Why a new witness?

\n\n

For a decentralized network like the Hive blockchain, the more people who run the software and validate transactions, the better. In our mind, a witness is a voice or representative of a community. We felt that our community was under-represented on the blockchain, and we plan to change that. We prefer to give small players a voice. To the greater Hive community that considers us friends: consider us your representative, so that we can voice your opinion to the governing body of this blockchain.

\n\n

What is the configuration of the witness server?

\n\n

We are running the witness node on a VPS from @privex with the following configuration.

\n\n
CPU: 4 Core
RAM: 8 GB
SSD: 500 GB
OS: Ubuntu 18.04
\n\n

This basic setup costs $50/month. The monthly expense is substantial in the developing world; however, we are willing to bear it for the visibility and representation. We need your support. Until we qualify for the top 20, we can't represent you to the full extent, so that should be our common goal.


We run the price feed from a Vultr VPS. At this moment we do not have a backup witness node, but we are planning to add one in the future.


What\u2019s in it for BDCommunity?


BDCommunity has been on the chain for almost 2 years, curating content and helping and representing the Bangladeshi and Bengali-speaking community. Now we want to show our support for the chain by taking part in its governance and security.


What\u2019s in it for the Hive community?


We would like the greater community to use us as a sounding board, not only as a representative of a specific part of the world but also as a way to rise above the tunnel vision that is typical of many blockchain projects. We want to bring something new to the table: bringing emerging content producers (writers, artists, music producers, comedians that we see in our everyday lives) to the Hive blockchain. This will not only enrich us at a personal level but will give the small players a voice, a vision, a unique experience. For investors and venture capitalists, we offer a competitive return on investment without the hassle of daily monitoring. We are manual curators and will always be the human touch on this blockchain.


How can an individual be involved?


Hive being a DPoS blockchain, every user has a say in the governance of the chain proportional to their HP. We have recently proved that every vote counts. So vote for us as a witness, or set us as your witness voting proxy.
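Under the hood, a witness vote (and unvote) is one small signed operation. A minimal sketch of its wire shape, assuming the standard Hive operation format (the helper name is ours; signing and broadcasting, which Hivesigner, Peakd, or a library such as beem handle, are omitted):

```python
# Hedged sketch, not BDC's tooling: on Hive, a witness vote is broadcast as an
# account_witness_vote operation; this helper only builds that operation's JSON shape.

def witness_vote_op(voter: str, witness: str, approve: bool = True) -> list:
    """Build the account_witness_vote operation; each account gets up to 30
    witness votes, weighted by its Hive Power."""
    return [
        "account_witness_vote",
        {"account": voter, "witness": witness, "approve": approve},
    ]

print(witness_vote_op("alice", "bdcommunity"))
```

Passing approve=False builds the corresponding unvote.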


Who is running the witness?


  • @reazuliqbal - Full-stack freelance developer. Developer and moderator of BDCommunity.
  • @zaku - Business and idea guy. Moderator of BDCommunity.


What have we done previously?


Content Curation


BDCommunity started as a content curation project for the Bengali-speaking community, as there weren't any back then. Now we curate content from both Bengali and non-Bengali users using the community curation account and our sister project @bdvoter.


Onboarding


We create free accounts for legitimate users who want to join the Blockchain and help them understand the basics and norms of the chain and how to secure their accounts.


Tools


Projects

  • MonsterMarket - Splinterlands cards and packs marketplace.
  • BDExchange - A Discord wallet for HIVE, STEEM, and Steem-Engine tokens, which also provides escrow services for peer-to-peer transactions.
  • BDVoter (@bdvoter) - For-profit curation project.

What will you do in the future?


We are planning to enhance our curation projects and develop more tools and projects on top of the Hive Blockchain.


What is the mission statement?


BDC has a vision to curate and support content across the whole Hive interface. There is a desire to on-board not only new members but also members who are emerging players in their respective fields. Introducing the power of social media on the blockchain to already established or emerging cultural leaders of the broader ecosystem is what BDC plans to achieve in the near future. This helps the Hive community in two ways: first, enrichment of the platform itself; and second, the spread of Hive to a more prominent segment of social media via these 'star content producers'.


How do I vote for you as a witness?


Hivesigner


Enter bdcommunity in the form below the witness list.

[Image: vote-bdc.png]

Peakd.com


Please go to https://peakd.com/witnesses and search for bdcommunity. Click the check button to vote for us.

[Image: bdc-peakd.png]

Source: https://hive.blog/witness-category/@bdcommunity/announcing-bdcommunity-hive-witness


Hello, I'm jamzed and I would like to become a witness ;-)
By @jamzed, 2018-04-24

# whoami


My name is Patryk, I'm 34 years old, and I live in s/Krakow/Cracow/, Poland. I'm a father and husband. I work as a Site Reliability Engineer in Security, and my primary focus is always on Performance, Security and High Availability.

[Image: jamzed.jpg]

My father brought the first computer, an ATARI 65XE, into the home when I was 3... and this is the first thing I remember from my childhood. ;-) The next computers were an Amiga 600 when I was 8 and finally the first real PC (Intel Pentium MMX 233 MHz) when I was 15. Currently I'm using a Mac as my primary workstation and Linux on all of my servers. Computers definitely changed me and my life... I've spent 90% of my time on this planet around keyboards and monitors... and I still love it. ;-)


# witness host


The witness node is located in France and hosted by OVH:

  • 4 cores, 8 threads (dedicated server, passmark benchmark > 9k)
  • 32GB RAM
  • SSDs in RAID
  • for security and monitoring: munin, rkhunter, fail2ban, iptables (with DROP as default policy), zabbix
  • feed price updates are automated by steemfeed [thx @clayop for this tool]
  • server and services are monitored from an external host (I'm going to set up a backup node as soon as I start signing the first blocks) ;-)
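The price feed that steemfeed automates boils down to periodically broadcasting a feed_publish operation carrying a fresh exchange rate. A hedged sketch of that operation's shape (the helper and the example rate are ours; fetching a real market price and signing the transaction are omitted):

```python
# Sketch only: builds the feed_publish operation a Steem witness signs and
# broadcasts; the rate below is illustrative, not a live quote.

def feed_publish_op(publisher: str, sbd_per_steem: float) -> list:
    """Quote how many SBD one STEEM is worth, in the format the chain expects."""
    return [
        "feed_publish",
        {
            "publisher": publisher,
            "exchange_rate": {
                "base": f"{sbd_per_steem:.3f} SBD",  # price side
                "quote": "1.000 STEEM",              # unit side
            },
        },
    ]

print(feed_publish_op("jamzed", 2.5))
```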

# my goals and challenges as witness


As a witness, my primary goals will be to support the Steemit community, understand all the details of the network so I can make the best decisions when it comes to HARDFORKs, and share the technical knowledge and experience I have gathered over the last 15+ years in IT working as a Network Specialist, System Administrator, Infrastructure Architect and SRE. I see this as a great opportunity and challenge for me.


# votes...


If you want to give me a chance to be a part of this awesome journey, please vote for me.

[Image: witness_vote.png]

Please also feel free to reach out to me on steemit.chat anytime ;-)


:wq!


Source: https://steemit.com/witness-category/@jamzed/hello-i-m-jamzed-and-i-would-like-to-become-a-witness


Steemit Quebec Witness
By @helo, 2018-04-24
[Image: helo-for-witness.png]

Hello Steemians, I am submitting my services as Steemit Quebec's first witness.


I got involved with Steemit Quebec and EOS Quebec thanks to @pnc and I plan to present and help organise meetups in and around Montreal.


I'm a DevOps engineer with over 20 years of experience in the field. I've administered Unix / Linux systems since 1994 and began programming at the tender age of 10 with the Commodore VIC-20.


I was introduced to Steemit by @jerrybanfield last June and I've learned to appreciate the value it brings to the world and I've been looking for ways to contribute ever since.


Server Specifications


I've chosen a powerful server with a lot of RAM and a fast drive to be future-proof, as Steem will surely take off in a big way in 2018.


Security

  • Remote hosting in Canada
  • SSH keys to log in
  • Password authentication disabled
  • Root login disabled
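A hedged sketch of how the last three items typically map onto OpenSSH server configuration; the directive names are standard sshd_config options, while the exact file layout on this host is an assumption:

```
# /etc/ssh/sshd_config (fragment)
PubkeyAuthentication yes     # log in with SSH keys
PasswordAuthentication no    # password authentication disabled
PermitRootLogin no           # root login disabled
```

Changes take effect after reloading the sshd service.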

Settings


Here are some of my latest contributions to Steem.


You will find that most of my posts are made on @utopian-io, and for that reason I would like to thank @elear for his contributions and dedication; I feel right at home here.


The server went live on December 26th, 2017 and has been ready to step in and produce blocks ever since. Many thanks to @jamzed in the witness chat for helping me set up my witness config and answering my questions.


The future


I plan to set up a seed node and a backup witness server soon.


Source: https://steemit.com/witness-category/@helo/steemit-quebec-witness


Why You Should Vote For @firepower As Witness—Witness Campaign Post From India!
By @firepower, 2018-04-24

In a handful of months from now I will complete 2 years on Steemit, and what a ride it has been so far! Suffice to say I fell in love with this platform and the Steem blockchain very early on! In the past year I've made it my full-time job to evangelise this incredible platform and technology, which as we know has the potential to change the face of social media and the world if it is properly developed.


Summary of my Activity on Steemit


I\u2019ve been focused on the community building aspect of the platform since the day 1. I\u2019ve curated for top whales, and have helped start initiatives such as the SteemCleaners project, Robinhood Whale and have been core team member of the incredible Project Curie. I\u2019m also an active admin of Steemit.Chat from it\u2019s early days where I've assisted many users over the past year and half.


I\u2019ve also had the pleasure of helping @roelandp and Steemit Inc in the perfect execution of two SteemFests with my amazing team\u2014firepowercrew. I've also sponsored and subsidised tickets to SteemFest for a few users. My work is largely detailed on the platform. I\u2019ve also assisted users such in recovering their accounts.


Recently I\u2019ve been building up the Indian community. My initiatives have helped in growing and retaining userbase from this region in the past year. I've also been able to inspire and motivate many others to do similar things in their areas.


The feedback I've received from the past year of work on the platform has moved me in this new direction. I love onboarding content creators and investors onto our platform. India, for example, is the biggest user base on Facebook today, and it would be great to see such numbers on Steemit.


Therefore, I felt it would be a logical step forward in my work on Steemit to run witness nodes.


Becoming a Witness To The Amazing Steem Blockchain


For the past two weeks I've been running a primary witness node, backup for failover and a seed node on excellent dedicated server hardware. I should have started this long ago and I could have achieved a lot more for our blockchain and Steemit.com but better late than never eh? I also got a good push in motivation due to a few users recommending me to do this over the past year.


Running a successful witness will allow me to contribute to the smooth functioning of our blockchain, as well as fund further community initiatives and better support growth initiatives from potentially one of the biggest markets for Steem in the world.


I think it\u2019s great to have witnesses (top and backup) to the blockchain who are not only developers, but also marketers, social leaders and community builders who can elevate this blockchain technology to the next level and take it across the globe. In my opinion this is a healthy mix.


My role as a community builder is all about bridging the gap between developers, their product and end users who can make use of these amazing technologies in their daily lives. Because in the real world, you can build anything and all you want, but it doesn't mean people will simply come over to use the product.

Backup Witness Node

RAM: 32 GB DDR4
CPU: Quad-Core
Disks: 2x500 GB SSD
Internet Connection: 1 Gigabit

Seed Node

URL: seed.firepower.ltd:2001
RAM: 32 GB DDR3
CPU: Intel Xeon E3
Disks: 2x120 GB SSD
Internet Connection: 1 Gigabit

I\u2019m happy to announce that, I\u2019m off to a good start and I\u2019m currently in 104th position. I have been signing blocks for about 2 weeks now. I wanted to take my systems for a test ride and ensure everything is running perfectly before making this post. So far things are running perfectly fine. It wasn't as complicated as its generally made out to be, and I\u2019m constantly learning more.


My dedicated servers are located at strategic locations ensuring excellent connectivity. They have been hardened to prevent any security issues. I'm open to scaling up the hardware based on the needs of the blockchain. They have been set up from the ground up to do their job perfectly. I've missed ZERO blocks since I started my witness 2 weeks ago, and you can expect reliable witness nodes from me.


How To Vote for My Witness


Please do not capitalise any letter when voting.


Enter firepower in the box. You can vote for 30 witnesses and it doesn't cost you anything.





Moving up the ranks will allow me to sustain the costs of running the servers, fund my projects, and continue being a reliable witness to the blockchain, signing a greater number of blocks without issues.


Would you like to proxy me?


Some users have gone ahead and proxied me, as they felt I'm capable of deciding on other witnesses for the blockchain on their behalf. It came as a surprise, but I felt it was great that many have put their trust in me. By proxying me you will be putting an experienced and established Steemit user in charge of voting for other reliable and trustworthy witnesses for the job of maintaining the blockchain.


Therefore, if you want me to be a witness proxy for your account and to vote on other witnesses on your behalf, you can do it from the same page using your active or master key. Again, none of this will cost you anything, and it takes only a few seconds.
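For the curious, setting a proxy is itself one small operation on the chain. A minimal sketch of its shape, assuming the standard Steem operation format (the helper name is ours; the website signs and broadcasts it for you):

```python
# Sketch of the account_witness_proxy operation: `proxy` will cast witness
# votes on behalf of `account`. An empty proxy string clears the delegation.

def witness_proxy_op(account: str, proxy: str) -> list:
    """Build the operation that delegates (or, with proxy="", reclaims)
    an account's witness votes."""
    return ["account_witness_proxy", {"account": account, "proxy": proxy}]

print(witness_proxy_op("alice", "firepower"))
```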


I will continue promoting Steem blockchain and cryptocurrency (Communities and SMT when they arrive) along with all the great apps on it such as Steemit, DTube, Busy.org, Utopian, Steepshot, Esteem mobile app, Chain-bb, Zappl and others.


Some Future Plans


I\u2019m trying to get Indian exchanges interested in listing a Steem/INR pair. In fact I've had a rep from the top Indian bitcoin exchange sit unofficially in a recent meetup to understand the response towards Steem cryptocurrency and blockchain. Overall the feedback received was positive. But we\u2019ve got some work to do!


I\u2019m currently gathering information on the legal and financial regulations that goes into running a cryptocurrency exchange in India-for some larger plans I have in the pipeline. Running a successful witness will certainly help with these endeavours in a big way in the long run or at the very least there it will be a good learning curve.


Nothing would make me happier than my countrymen and women being able to trade into Steem directly with Rupees! I\u2019m certain it would happen someday and it would help in faster on-boarding of investors like never before.


However, I won't be stopping here, in fact I won't stop until I connect Steemians across the world with each other and bring users who have never heard of this empowering technology onto the various apps on the Steem blockchain.


A Few Contributions To Steemit


You may look at some of my contributions to the platform to learn more about the work I've put in on Steemit over the past year and a half, and I intend to do a lot more of this in the years to come!


Source: https://steemit.com/witness-category/@firepower/why-you-should-firepower-as-witness-witness-campaign-post-from-india